How to copy an image to Amazon S3? - PHP

I have a problem copying an image to Amazon S3.
I am using the PHP copy function to copy the image from one server to another. It works on my GoDaddy host server, but it doesn't work for S3. Here is the code that is not working:
$strSource = 'http://img.youtube.com/vi/got6nXcpLGA/hqdefault.jpg';
copy($strSource, $dest);
$dest is my bucket URL, with the folder present to upload images into.

I am not sure you can copy an image to S3 just like that. I would suggest using a library which talks to the AWS API and then running your commands.
Check this - http://undesigned.org.za/2007/10/22/amazon-s3-php-class
It provides a REST implementation for AWS.
For example, if you want to copy your image, you can do:
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->copyObject($srcBucket, $srcName, $bucketName, $saveName, $metaHeaders = array(), $requestHeaders = array());
$awsAccessKey and $awsSecretKey are the access credentials for your AWS account.
Check it out and hope it helps.
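For the original question (pulling a remote image into a bucket rather than copying between buckets), a rough sketch along these lines should work with the same class; the bucket name and key below are placeholders, and I'm assuming the class's putObjectFile() method as used elsewhere on this page:
// Download the remote image to a temporary file first
$strSource = 'http://img.youtube.com/vi/got6nXcpLGA/hqdefault.jpg';
$tmpFile = tempnam(sys_get_temp_dir(), 'img');
file_put_contents($tmpFile, file_get_contents($strSource));

// Upload the temporary file to the bucket under the chosen key
$s3 = new S3($awsAccessKey, $awsSecretKey);
if ($s3->putObjectFile($tmpFile, 'my-bucket-name', 'images/hqdefault.jpg', S3::ACL_PUBLIC_READ)) {
    echo 'Image copied to S3';
}
unlink($tmpFile);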

Not sure if you have used the AWS PHP SDK, but the AWS SDKs can come in handy in situations like this. The SDK can be used in conjunction with IAM roles to grant access to your S3 bucket. These are the steps:
Modify your code to use the PHP SDK to upload the files (if needed).
Create an IAM Role and grant the role permission to the needed S3 buckets.
When you start your EC2 instance, specify that you want to use the role.
Then your code will automatically use the permissions that you grant that role. IAM gives the instance temporary credentials that the SDK uses. These credentials are automatically rotated for you by IAM and EC2.
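A minimal sketch of what that could look like with the SDK v2 client (the region, bucket, and key are placeholders; because the instance has an IAM role, no credentials appear in the code):
require 'vendor/autoload.php';

// With an IAM role attached to the EC2 instance, the SDK picks up
// temporary credentials automatically - no keys in the code.
$s3 = Aws\S3\S3Client::factory(array('region' => 'us-east-1'));

// Fetch the remote image and put it into the bucket
$s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key'    => 'images/hqdefault.jpg',
    'Body'   => file_get_contents('http://img.youtube.com/vi/got6nXcpLGA/hqdefault.jpg'),
    'ACL'    => 'public-read',
));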

Here is my example from the documentation to copy an object in an S3 bucket:
public function copyObject($sSourceKey, $sDestKey)
{
    $this->checkKey($sSourceKey);
    $this->checkKey($sDestKey);
    $bRet = false;
    // http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.S3Client.html#_copyObject
    try {
        $response = $this->_oS3Client->copyObject(
            array(
                'Bucket'     => $this->getBucketName(),
                'Key'        => $sDestKey,
                'CopySource' => urlencode($this->getBucketName() . '/' . $sSourceKey),
            )
        );
        if (isset($response['LastModified'])) {
            $bRet = true;
        }
    } catch (Exception $e) {
        $GLOBALS['error'] = 1;
        $GLOBALS["info_msg"][] = __METHOD__ . ' ' . $e->getMessage();
        $bRet = false;
    }
    return $bRet;
}

Related

Upload to AWS S3 'folder' inside bucket with PHP

Apologies, I'm not an experienced AWS user. I need to upload a file to a 'folder' in an existing bucket within AWS S3. I understand that there is no concept of 'folders' within S3, so how can I specify the 'folder' it needs to go to?
My uploaded image needs to follow an existing structure of
https://s3.amazonaws.com/my.bucket.name/newfolder/test_image.png
The code I have below works, but it puts the image into the root bucket, whereas I need to put it into a new folder ($newfolder).
Does anyone know where in the below code I could specify the $newfolder to achieve this?
Thank you.
<?php
$newfolder = "newfolder";
$bucket = 'my.bucket.name';
require_once('S3.php');
$awsAccessKey = 'MyAccessKey';
$awsSecretKey = 'MySecretKey';
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);
$actual_image_name = 'test_image.png';
if ($s3->putObjectFile('/var/www/html/test/image.png', $bucket, $actual_image_name, S3::ACL_PUBLIC_READ))
{
    $image = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
}
else
{
    echo 'error uploading to S3 Amazon';
}
?>
$actual_image_name = 'newfolder/test_image.png';
S3 calls this the key inside the bucket. The key can contain slashes, which are treated as folders when viewed with an appropriate client. But the concept of "folders" doesn't really exist in the S3 API or in URLs; there are just keys with slashes in them.
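Applied to the code above, that just means prefixing the key with $newfolder before calling putObjectFile, something like:
// Prefix the key with the desired "folder"
$actual_image_name = $newfolder . '/' . 'test_image.png'; // => newfolder/test_image.png

if ($s3->putObjectFile('/var/www/html/test/image.png', $bucket, $actual_image_name, S3::ACL_PUBLIC_READ))
{
    // Now reachable at https://s3.amazonaws.com/my.bucket.name/newfolder/test_image.png
    $image = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
}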

Upload File to Dropbox from User's Machine using PHP API

I am a newbie to the Dropbox API. I have completed the code which uploads a file (on my website host) to my Dropbox with a given token. The process runs successfully.
I want to design a page which allows the user to select a file (from his/her local machine) and upload it directly to my Dropbox.
I have an idea that the controller will upload the file to the host first and then upload it to Dropbox. However, this idea sucks as it takes more time and bandwidth to complete, and I have to delete the file on the host after uploading.
This is the code which works on my host:
<?php
require_once "dropbox-sdk/lib/Dropbox/autoload.php";
use \Dropbox as dbx;

$dropbox_config = array(
    'key'    => 'my key',
    'secret' => 'my secret key'
);
$appInfo = dbx\AppInfo::loadFromJson($dropbox_config);
$webAuth = new dbx\WebAuthNoRedirect($appInfo, "PHP-Example/1.0");
$accessToken = 'my token code is given here';
$dbxClient = new dbx\Client($accessToken, "PHP-Example/1.0");

// Uploading the file
$f = fopen("working-draft.txt", "rb");
$result = $dbxClient->uploadFile("/working-draft.txt", dbx\WriteMode::add(), $f);
fclose($f);
//print_r($result);

// Get file info
$file = $dbxClient->getMetadata('/working-draft.txt');

// Sending the direct link:
$dropboxPath = $file['path'];
$pathError = dbx\Path::findError($dropboxPath);
if ($pathError !== null) {
    fwrite(STDERR, "Invalid <dropbox-path>: $pathError\n");
    die;
}

// The $link is an array!
$link = $dbxClient->createTemporaryDirectLink($dropboxPath);
$dw_link = $link[0] . "?dl=1";
echo "Download link: " . $dw_link . "<br>";
?>
And I am using CodeIgniter.
I have an idea that the controller will upload the file to the host first and then upload it to Dropbox. However, this idea sucks as it takes more time and bandwidth to complete.
Unfortunately, this is the only secure way to do this. Adding a file to your Dropbox requires knowing an OAuth access token for your account, and you can't allow others to know that token. (It would give them control of the account.) So the token needs to be kept secret, on your server. That means the file needs to be uploaded to your server and transferred to Dropbox from there.
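A rough sketch of that flow, reusing the $dbxClient from the code above (the form field name userfile is a placeholder):
// Controller action receiving the user's upload, e.g. from <input type="file" name="userfile">
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $tmpPath  = $_FILES['userfile']['tmp_name'];
    $destPath = '/' . basename($_FILES['userfile']['name']);

    // Stream the temporary upload straight to Dropbox
    $f = fopen($tmpPath, 'rb');
    $result = $dbxClient->uploadFile($destPath, dbx\WriteMode::add(), $f);
    fclose($f);

    // No manual cleanup needed: PHP removes the temporary upload file
    // automatically at the end of the request.
}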

How to rename or move a file in Google Cloud Storage (PHP API)

I am currently trying to rename and/or move a Cloud Storage file to another name/position, but I can't get it to work. I am using https://github.com/google/google-api-php-client as the client, and uploads work fine with:
...
$storageService = new \Google_Service_Storage($client);
$file = new \Google_Service_Storage_StorageObject();
$file->setName('test.txt');
$storageService->objects->insert(
    $bucketName,
    $file,
    array(
        'name' => $filename,
        'data' => file_get_contents($somefile)
    )
);
...
So I have tried to change a filename with the $storageService->objects->update() method, but I cannot find any documentation on this. I used $storageService->objects->get($bucketName, $fileName) to get the specific file I wanted to rename (with $file->setName()), but it seems I just cannot pass the file to the objects->update function. Am I doing it wrong?
OK, it seems I cannot directly rename a file (please correct me if I'm wrong); I can only update the metadata. I managed to get it to work by copying the file to a new filename/destination and then deleting the old file. I successfully used $storageService->objects->copy and $storageService->objects->delete for this. This doesn't feel right, but at least it works.
As this is not very well documented by Google, here is a basic example:
// RENAME FILE ON GOOGLE CLOUD STORAGE (GCS)
// Get client and auth token (might vary depending on the way you connect to GCS - here with the Laravel framework facade)
// DOC: https://cloud.google.com/storage/docs/json_api/v1/json-api-php-samples
// DOC: https://developers.google.com/api-client-library/php/auth/service-accounts
// Laravel client: https://github.com/pulkitjalan/google-apiclient

// Get Google client
$gc = \Google::getClient();

// Get auth token if it is not valid/not there yet
if ($gc->isAccessTokenExpired())
    $gc->getAuth()->refreshTokenWithAssertion();

// Get Google Cloud Storage service with the client
$gcStorageO = new \Google_Service_Storage($gc);

// GET object at old position ($path)
// DOC: https://cloud.google.com/storage/docs/json_api/v1/objects/get
$oldObj = $gcStorageO->objects->get($bucket, $path);

// COPY desired object from old position ($path) to new position ($newpath)
// DOC: https://cloud.google.com/storage/docs/json_api/v1/objects/copy
$gcStorageO->objects->copy(
    $bucket, $path,
    $bucket, $newpath,
    $oldObj
);

// DELETE old object ($path)
// DOC: https://cloud.google.com/storage/docs/json_api/v1/objects/delete
$gcStorageO->objects->delete($bucket, $path);
I found that when using gs:// paths in conjunction with PHP on App Engine, you can execute pretty much every PHP file command: copy, delete, check if a file exists.
if (file_exists("gs://$bucket/{$folder}/$old_temp_file")) {
    $old_path = "gs://$bucket/{$folder}/$old_temp_file";
    $new_permanent_path = "gs://$bucket/{$folder}/$new_permanent_file";
    copy($old_path, $new_permanent_path);
    unlink($old_path);
}

How to copy a file from a website to amazon bucket using zend service amazon?

I need to copy a resource from a website to my S3 bucket. For example, an image like 'http://upload.wikimedia.org/wikipedia/commons/6/63/Wikipedia-logo.png'. I need to copy this to a folder in my S3 bucket. Is this possible using Zend_Service_Amazon?
You have to use stream wrappers.
I haven't dealt with image files, but I hope it will work.
$s3 = new Zend_Service_Amazon_S3($my_aws_key, $my_aws_secret_key);

// Register the S3 stream wrapper under a random scheme name
$s5 = "s" . rand();
$s3->registerStreamWrapper($s5);

// $bucketname - your bucket name
mkdir($s5 . "://" . $bucketname);

// $path - where you want to store your file, including the bucket name
$s1 = $s5 . "://" . $path;
$filedata = file_get_contents('your image url');
file_put_contents($s1, $filedata);

Seeing if object exists in S3 using PHP

I am using PHP and I am using the S3 API to upload a file, but I wanted to make sure that this exact filename doesn't already exist in the bucket before upload.
I have found a few examples online that use "file_get_contents", but doesn't this mean that you would have to download the entire file first? These files are usually about 10 MB, so ideally I wouldn't really want to do this.
Is there perhaps a way to use "file_get_contents" without downloading the file?
Or better yet, perhaps I could use an API request to see if the filename exists?
It's not important to me whether or not the content, or filesize, is the same, just the filename.
The SDK's AmazonS3::doesObjectExist() method gets whether or not the specified Amazon S3 object exists in the specified bucket:
$s3 = new AmazonS3();
$bucket = 'my-bucket' . strtolower($s3->key);
$response = $s3->doesObjectExist($bucket, 'test1.txt');
// Success? (Boolean, not a CFResponse object)
var_dump($response);
Try the code below:
$s3 = new S3();
$info = $s3->getObjectInfo($bucket, $filename);
if ($info)
{
    echo 'File exists';
}
else
{
    echo 'File does not exist';
}
Download the S3 SDK from Amazon for PHP. There is a class called S3; create an S3 object. The object will allow you to call the getObjectInfo() method. Pass your S3 bucket name and the file name (often the file name is referred to as the key). getObjectInfo() will return some information if the file exists, otherwise the method will return FALSE.
Please note that the other suggestions are based on version 1 of the AWS SDK for PHP. For version 2, you'll want to be familiar with the latest guide found here:
http://docs.aws.amazon.com/aws-sdk-php/guide/latest/index.html
The "Getting Started" section in the link above will help you get the SDK installed and setup, so be sure to take your time reading through those docs if you haven't done so already. When you're done with the setup, you'll want to be familiar with the stream wrapper method found here:
http://docs.aws.amazon.com/aws-sdk-php/guide/latest/feature-s3-stream-wrapper.html
Finally, below is a brief, real-life example of how you could use it in the flow of your code.
require('vendor/autoload.php');

// your filename
$filename = 'my_file_01.jpg';

// this will use AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from env vars
$s3 = Aws\S3\S3Client::factory();

// S3_BUCKET must also be defined in env vars
$bucket = getenv('S3_BUCKET') ?: die('No "S3_BUCKET" config var found in env!');

// register the stream wrapper
$s3->registerStreamWrapper();

// does the file exist?
$keyExists = file_exists("s3://" . $bucket . "/" . $filename);
if ($keyExists) {
    echo 'File exists!';
}
If you have, or are able to install, the PECL extension pecl_http, then you can use http_head() to make a HEAD request easily and check whether the response was 200 or 404.
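A rough sketch of that approach, assuming pecl_http v1's http_head() and its $info reference parameter (the bucket URL is a placeholder, and the object has to be publicly readable, otherwise S3 tends to answer 403 instead of 404):
// HEAD request against the object's public URL
$url = 'http://my-bucket-name.s3.amazonaws.com/test1.txt';
$info = array();
http_head($url, array(), $info);

if ($info['response_code'] == 200) {
    echo 'File exists';
} else {
    echo 'File does not exist (HTTP ' . $info['response_code'] . ')';
}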
Updated version for anyone looking for v3 and up...
$s3Client = new \Aws\S3\S3Client([
    'version' => 'latest',
    'region' => getenv('AWS_REGION'),
    'credentials' => [
        'key' => getenv('AWS_KEY'),
        'secret' => getenv('AWS_SECRET')
    ]
]);

$response = $s3Client->doesObjectExist(getenv('AWS_S3_BUCKET'), 'somefolder/somefile.ext');
if ($response) {
    echo "Yay, it exists :)";
} else {
    echo "Boo, nothing there :(";
}
