Uploading Objects to AWS S3 with SDK 3 for PHP

For many years my PHP application has used the AWS SDK 2 for PHP, and now we are considering switching to SDK 3.
However, looking through the SDK documentation we couldn't find any simple example; they all talk about multipart uploads and other things that are very different from what we have today.
The code below is what we have for SDK 2. How would a simple object be uploaded to a bucket using SDK 3?
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

define('S3_KEY', '');
define('S3_SECRET', '');
define('S3_BUCKET_NAME', '');

$s3 = S3Client::factory(array(
    'key'    => S3_KEY,
    'secret' => S3_SECRET
));

$filename = 'example/001/1.jpg';

try {
    // $origin is the path to the source file (set elsewhere in our code).
    $resource = fopen($origin, 'r');
    $return = $s3->upload(S3_BUCKET_NAME, $filename, $resource, 'public-read');
    return $return['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}

In AWS SDK for PHP version 3, ObjectUploader can upload a file to Amazon S3 using either PutObject or MultipartUploader, whichever is best based on the payload size.
Below is sample code you can use to upload objects to an S3 bucket:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\ObjectUploader;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-east-2',
    'version' => '2006-03-01'
]);

$bucket = 'your-bucket';
$key = 'my-file.zip';

// Using a stream instead of a file path
$source = fopen('/path/to/file.zip', 'rb');

$uploader = new ObjectUploader(
    $s3Client,
    $bucket,
    $key,
    $source
);

do {
    try {
        $result = $uploader->upload();
        if ($result["@metadata"]["statusCode"] == '200') {
            print('<p>File successfully uploaded to ' . $result["ObjectURL"] . '.</p>');
        }
        print($result);
    } catch (MultipartUploadException $e) {
        // If the upload failed part-way, resume it from the saved state.
        rewind($source);
        $uploader = new MultipartUploader($s3Client, $source, [
            'state' => $e->getState(),
        ]);
    }
} while (!isset($result));

fclose($source);
Key differences in Version 3 of the SDK:
Use new instead of factory() to instantiate the client.
The 'version' and 'region' options are required during instantiation.
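If you only need the most direct equivalent of the SDK 2 upload() call from the question (small files, no multipart handling), a plain PutObject request also works. Here is a minimal sketch reusing the question's constants and variables; the region value is an assumption:

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3 = new S3Client([
    'version'     => '2006-03-01',
    'region'      => 'us-east-1',   // assumed; use your bucket's region
    'credentials' => [
        'key'    => S3_KEY,
        'secret' => S3_SECRET,
    ],
]);

try {
    $result = $s3->putObject([
        'Bucket' => S3_BUCKET_NAME,
        'Key'    => $filename,
        'Body'   => fopen($origin, 'r'),
        'ACL'    => 'public-read',
    ]);
    return $result['ObjectURL'];   // the SDK adds ObjectURL to PutObject results
} catch (S3Exception $e) {
    return false;
}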

Related

Does AWS PHP SDK automatically retry multipart uploads?

Based on the SDK code, the S3 client uses retry logic, but the sample code from the docs suggests looping until the multipart upload finishes correctly.
$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-east-2',
    'version' => '2006-03-01'
]);

$bucket = 'your-bucket';
$key = 'my-file.zip';

// Using a stream instead of a file path
$source = fopen('/path/to/large/file.zip', 'rb');

$uploader = new ObjectUploader(
    $s3Client,
    $bucket,
    $key,
    $source
);

do {
    try {
        $result = $uploader->upload();
        if ($result["@metadata"]["statusCode"] == '200') {
            print('<p>File successfully uploaded to ' . $result["ObjectURL"] . '.</p>');
        }
        print($result);
    } catch (MultipartUploadException $e) {
        rewind($source);
        $uploader = new MultipartUploader($s3Client, $source, [
            'state' => $e->getState(),
        ]);
    }
} while (!isset($result));
Is that MultipartUploadException thrown after the standard 3 retries have happened? Or are multipart uploads not covered by the retry policy?

Upload file to Amazon EC2 server from website by PHP

I have a website (bedatify.com) and I want to make a page where people can upload their images to my Amazon EC2 server.
I checked similar questions like
Unable to upload files on Amazon EC2 - php and how to upload to files to amazon EC2
but I can't figure out how to manage it.
Is this piece of code a good start? What should I change to let users upload pictures directly to my EC2 server from my website?
<?php
if (isset($_POST['image'])) {
    echo "in";
    $image = $_POST['image'];
    upload($_POST['image']);
    exit;
} else {
    echo "image_not_in";
    exit;
}

function upload($image) {
    $now = DateTime::createFromFormat('U.u', microtime(true));
    $id = "pleeease";
    $upload_folder = "/var/www/html/upload";
    $path = "$upload_folder/$id.jpg";
    if (file_put_contents($path, base64_decode($image)) != false) {
        echo "uploaded_success";
    } else {
        echo "uploaded_failed";
    }
}
?>
Just a tip:
This is a perfect use case for S3.
So upload and retrieve the images from S3 within your PHP backend.
If you upload them to your EC2 instance, the static files can fill up your instance's storage. And what if the instance gets terminated?
There is a PHP SDK you can use:
https://aws.amazon.com/de/sdk-for-php/
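The example below assumes you already have a configured S3Client instance ($s3Client). Creating one looks roughly like this (the region here is an assumption; credentials come from the SDK's default provider chain, e.g. environment variables or an instance profile):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Assumed region; the SDK resolves credentials from environment variables,
// the shared credentials file, or an EC2 instance profile.
$s3Client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-central-1',
]);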
An example would be:
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$uploader = new MultipartUploader($s3Client, '/path/to/large/file.zip', [
    'bucket' => 'your-bucket',
    'key'    => 'my-file.zip',
]);

try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
I hope that helps!
Dominik
uploadfile.php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$IAM_KEY = 'xxxx';
$IAM_SECRET = 'xxxx';
$bucket = 'xxxx';

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
    'credentials' => [
        'key'    => $IAM_KEY,
        'secret' => $IAM_SECRET
    ]
]);

$file = $_FILES["fileToUpload"]["tmp_name"];

try {
    // Upload data.
    $result = $s3->putObject([
        'Bucket'     => $bucket,
        'Key'        => 'xxx',
        'SourceFile' => $file
    ]);
    // Print the URL to the object.
    echo $result['ObjectURL'] . PHP_EOL;
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
index.html
<form action="/AWS/uploadfile.php" method="post" enctype="multipart/form-data">
Select image to upload:
<input type="file" name="fileToUpload" id="fileToUpload">
<input type="submit" value="Upload Image" name="submit">
</form>

AWS s3 Listing Object Keys Using PHP

I'm working on Listing Object Keys Using PHP with AWS S3.
I have the bucket set up with one test file loaded:
Screenshot of my s3 bucket with one test file
Here is the reference that I'm using from AWS documentation:
AWS ListingObjectKeysUsingPHP webpage
Here is the code that I'm using from the page:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'apollos-integrations-public';

// Instantiate the client.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

// Use the high-level iterators (returns ALL of your objects).
try {
    $objects = $s3->getPaginator('ListObjects', [
        'Bucket' => $bucket
    ]);
    //var_dump($objects);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($objects as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}

// Use the plain API (returns ONLY up to 1000 of your objects).
try {
    $result = $s3->listObjects([
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($result['Contents'] as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
When I execute this PHP script, it simply returns "Keys retrieved!" without listing the test file. It should list the files in the folder.
When I uncomment the var_dump($objects); line (commented out in the script), it dumps a lot of data for the object, as shown in this screenshot:
image for the object var dump
For some reason, this part of the code provided by Amazon is not working:
foreach ($objects as $object) {
    echo $object['Key'] . PHP_EOL;
}
Why isn't this code provided by AWS working?
Shouldn't the key and secret be required?
Please help
Your problem was most likely caused by the PHP AWS SDK not being able to get the credentials from your environment variables. The credentials are either taken from environment variables, or you can specify them directly in your code like this:
// Instantiate the client.
$credentials = new Aws\Credentials\Credentials('<KEY>', '<SECRET>');
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
    'credentials' => $credentials
]);
(Replace '<KEY>' and '<SECRET>' with your credentials.) You can find the whole script below.
More information can be found here: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_configuration.html
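If you prefer the environment-variable route mentioned above instead of hard-coding credentials, a minimal sketch looks like this (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are the variable names the SDK's default credential chain checks):

// Normally you would export these in your shell or web-server config;
// putenv() here just illustrates the variable names the SDK looks for.
putenv('AWS_ACCESS_KEY_ID=<KEY>');
putenv('AWS_SECRET_ACCESS_KEY=<SECRET>');

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
    // no 'credentials' entry: the SDK falls back to the environment variables
]);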
Whole script (notice the first two lines, where I enabled debugging output. This will show you error messages in case there is a problem, like in this case with the credentials):
<?php
error_reporting(-1);
ini_set('display_errors', 'On');

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'apollos-integrations-public';

// Instantiate the client.
$credentials = new Aws\Credentials\Credentials('<KEY>', '<SECRET>');
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
    'credentials' => $credentials
]);

// Use the high-level iterators (returns ALL of your objects).
try {
    $objects = $s3->getPaginator('ListObjects', [
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($objects as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}

// Use the plain API (returns ONLY up to 1000 of your objects).
try {
    $result = $s3->listObjects([
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($result['Contents'] as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
By the way, this is the error message you would see once debugging output is enabled:
Fatal error: Uncaught Aws\Exception\CredentialsException: Error retrieving credentials from the instance profile metadata server.
Hopefully you find it helpful.

How to check if 2 images with different file names are identical with PHP

I am getting images from the web and sending them to an S3 bucket on AWS. Each image gets a random name. But I don't want to store identical twins. So, before storing an image on my server (and then sending it to S3), I need to check whether an identical image already exists there (on S3).
I cannot do the check on my server, because files are deleted after being successfully sent to S3.
Here is my code:
$filenameIn = $url_img;
$nome = randomString();
$nome_img = "$nome.png";
$filenameOut = __DIR__ . '/img/' . $nome_img;
$contentOrFalseOnFailure = file_get_contents($filenameIn);
$bucket = 'bucket-img';

if (!$contentOrFalseOnFailure) {
    return "error1: ...";
}

$byteCountOrFalseOnFailure = file_put_contents($filenameOut, $contentOrFalseOnFailure);
if (!$byteCountOrFalseOnFailure) {
    return "error2: ...";
}

// Instantiate the S3 client with your AWS credentials
$aws = new Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'sa-east-1',
    'credentials' => array(
        'key'    => 'omissis',
        'secret' => 'omissis',
    )
]);

/*
I BELIEVE AT THIS POINT I SHOULD MAKE A COMPARISON TO CHECK IF THE FILE THAT
IS ABOUT TO BE SENT TO S3 HAS AN IDENTICAL TWIN THERE, REGARDLESS OF THEIR
RANDOM NAME. IF SO, UPLOAD SHOULD BE ABORTED AND THE SCRIPT SHOULD PROVIDE
ME WITH THE URL OF THE TWIN.
*/

try {
    // Send a PutObject request and get the result object.
    $result = $aws->putObject([
        'Bucket'     => $bucket,
        'Key'        => $nome_img,
        'SourceFile' => $filenameOut,
        'ACL'        => 'public-read'
    ]);
    $r = $aws->getObjectUrl($bucket, $nome_img);
    return $r;
} catch (S3Exception $e) {
    return $e->getMessage() . "\n";
}
As I said in the code, I should get the URL of the twin file, if it exists.
Is that possible?
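One way to make this possible is to derive the object key from a hash of the image bytes instead of a random name, and check for that key before uploading; identical content then always maps to the same key. A minimal sketch along those lines, reusing the variables above (the hash choice and ".png" suffix are assumptions):

// Sketch: use a content hash as the key so identical images collide on the same object.
$hash = md5($contentOrFalseOnFailure);
$nome_img = "$hash.png";

if ($aws->doesObjectExist($bucket, $nome_img)) {
    // An identical twin is already in the bucket: return its URL and skip the upload.
    return $aws->getObjectUrl($bucket, $nome_img);
}

$aws->putObject([
    'Bucket'     => $bucket,
    'Key'        => $nome_img,
    'SourceFile' => $filenameOut,
    'ACL'        => 'public-read'
]);

return $aws->getObjectUrl($bucket, $nome_img);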

Using PHP to upload to Amazon S3

I've spent the last few hours following tutorials for implementing file uploads to Amazon S3 using PHP. I uploaded the most recent version of Donovan Schönknecht's S3 class to my server (as S3.php) and I am trying to use the following code to test upload capability. I know this code will work because I've seen numerous examples of it in action.
<?php
require('S3.php');

$s3 = new S3('KEY', 'SECRET KEY');

//insert into s3
$new_name = time() . '.txt';

S3::putObject(
    'upload-me.txt',
    'bucketName',
    $new_name,
    S3::ACL_PUBLIC_READ,
    array(),
    array(),
    S3::STORAGE_CLASS_RRS
);
?>
I get a 500 server error when I attempt to load this page. Additionally, every other reputable tutorial of this nature has given me the same 500 error.
I verified that my key and secret key are valid by connecting to S3 with Cyberduck.
Does anyone have a clue as to what I could be doing incorrectly?
Thanks,
Sean
As it turns out, I was missing the cURL extension for PHP, and this was causing the issue, as the S3 class I was using requires cURL. All is working now.
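If you hit the same 500 error, a quick sanity check for the missing extension looks like this (just a sketch):

// The standalone S3.php class needs the cURL extension to talk to AWS.
if (!extension_loaded('curl')) {
    die('The PHP cURL extension is not installed or enabled.');
}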
You should also consider using the official AWS SDK for PHP. Examples for using S3 with the SDK can be found in their S3 user guide.
You can download the most recent version of the Amazon PHP SDK by running the following Composer command:
composer require aws/aws-sdk-php
The further configuration to upload a file to Amazon S3 is as follows:
// Include the SDK using the Composer autoloader
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Set Amazon S3 credentials
$client = S3Client::factory(
    array(
        'key'    => "your-key",
        'secret' => "your secret key"
    )
);

try {
    $client->putObject(array(
        'Bucket'       => 'your-bucket-name',
        'Key'          => 'your-filepath-in-bucket',
        'SourceFile'   => 'source-filename-with-path',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
} catch (S3Exception $e) {
    // Catch an S3 specific exception.
    echo $e->getMessage();
}
Get step-by-step details here: Amazon S3 File Upload Using PHP
The following example worked for me:
<?php
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$client = S3Client::factory([
    'version' => 'latest',
    'region'  => 'us-west-1',
    'credentials' => [
        'key'    => "<secret-key>",
        'secret' => "<my-secret>"
    ]
]);

try {
    $client->putObject([
        'Bucket'     => '<my-bucket-name>',
        'Key'        => '<file-name>',
        'SourceFile' => '<file-path-on-server>', // like /var/www/vhosts/mysite/file.csv
        'ACL'        => 'public-read',
    ]);
} catch (S3Exception $e) {
    // Catch an S3 specific exception.
    echo $e->getMessage();
}
Getting security credentials:
https://aws.amazon.com/blogs/security/wheres-my-secret-access-key/
https://console.aws.amazon.com/iam/home?#/security_credential
Getting region code
https://docs.aws.amazon.com/general/latest/gr/rande.html
Use this one to upload images using a form; it's working fine for me.
You may try using it with your code.
$name = $_FILES['photo']['name'];
$size = $_FILES['photo']['size'];
$tmp  = $_FILES['photo']['tmp_name'];

////// Upload Process

// Bucket Name
$bucket = 'bucket-name';

require_once('S3.php');

// AWS access info
$awsAccessKey = 'awsAccessKey';
$awsSecretKey = 'awsSecretKey';

// Instantiate the class
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// Rename the image.
$actual_image_name = time();

// Upload to S3
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $image = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
} else {
    echo 'error uploading to S3 Amazon';
}
I never found an updated script using Amazon's latest SDK, so I made one myself. It works as a PHP command-line interpreter script. Give it a try:
https://github.com/arizawan/aiss3clientphp
I'm not familiar with the S3 API, but I used it as the storage backend with https://github.com/KnpLabs/Gaufrette. Gaufrette is a library that provides a pretty nice abstraction layer above S3 and other file services/systems.
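Roughly what that looks like with the S3 adapter (a sketch based on Gaufrette's documentation; the bucket name and region are assumptions):

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Gaufrette\Adapter\AwsS3 as AwsS3Adapter;
use Gaufrette\Filesystem;

$s3Client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Wrap the S3 client in a Gaufrette filesystem; writes become S3 uploads.
$adapter = new AwsS3Adapter($s3Client, 'your-bucket');
$filesystem = new Filesystem($adapter);

$filesystem->write('my-file.txt', 'file contents here', true); // true = allow overwrite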
Here is sample code to upload images to Amazon S3 using the standalone S3.php class.
// Bucket Name
$bucket = "BucketName";

if (!class_exists('S3')) require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'ACCESS_KEY');
if (!defined('awsSecretKey')) define('awsSecretKey', 'ACCESS_Secret_KEY');

$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// $tmp is the uploaded file's temporary path (e.g. $_FILES['photo']['tmp_name'])
// and $actual_image_name is the object name to store it under.
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $message = "S3 Upload Successful.";
    $s3file = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
    echo "<img src='$s3file'/>";
    echo 'S3 File URL:' . $s3file;
} else {
    $message = "S3 Upload Fail.";
}
Below is the best solution. It uses multipart upload. Make sure to install the AWS SDK for PHP before using it.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

try {
    $s3Client = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',
        'credentials' => [
            'key'    => 'your-access-key-id',
            'secret' => 'your-secret-access-key',
        ],
    ]);

    // Use multipart upload
    $source = 'https://b8q9h6y2.stackpathcdn.com/wp-content/uploads/2016/08/banner-for-website-4.png';
    $uploader = new MultipartUploader($s3Client, $source, [
        'bucket' => 'videofilessandeep',
        'key'    => 'my-file.png',
        'acl'    => 'public-read', // the uploader option is lowercase 'acl'
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete: {$result['ObjectURL']}\n";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage() . "\n";
    }
} catch (Exception $e) {
    print_r($e);
}
