To search images in an S3 folder by keywords - php

What is the simplest way to search for and retrieve a list of images from S3 using English keywords? Or do I have to use Rekognition and store all the image metadata in a database?
I am developing in PHP.

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '*** Your Bucket Name ***';

// Instantiate the client.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

// Use the high-level iterators (returns ALL of your objects).
try {
    $objects = $s3->getPaginator('ListObjects', [
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    // Each iteration of the paginator yields one page of results.
    foreach ($objects as $result) {
        foreach ($result['Contents'] ?? [] as $object) {
            echo $object['Key'] . PHP_EOL;
        }
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
// Use the plain API (returns ONLY up to 1000 of your objects).
try {
    $result = $s3->listObjects([
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($result['Contents'] as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
This code will return all the objects in your bucket. You can then add logic so that a key is printed only if it ends with an image extension such as 'jpg', 'jpeg', or 'png'.
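For example, the filtering step might look like the sketch below. `filterImageKeys()` is a hypothetical helper name (not part of the SDK); the key list would come from the ListObjects calls shown above:

```php
<?php
// Sketch: keep only keys whose extension marks them as images.
function filterImageKeys(array $keys, array $extensions = ['jpg', 'jpeg', 'png']): array
{
    return array_values(array_filter($keys, function ($key) use ($extensions) {
        // Compare extensions case-insensitively ("JPG" counts as "jpg").
        $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
        return in_array($ext, $extensions, true);
    }));
}

// Example with some sample keys:
$keys = ['photos/cat.JPG', 'docs/readme.txt', 'img/logo.png'];
foreach (filterImageKeys($keys) as $key) {
    echo $key . PHP_EOL;
}
```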

You can list all the objects and find the ones you are looking for.
In pseudo-code:
connect to S3
loop through all buckets (or specify a bucket)
for each object in bucket.objects.all
    if object.key matches your search criteria
        do something
I have code that does this for me in Python; let me know if you would like me to post it. Since you are using PHP, I gave you the logic in pseudo-code above.
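In PHP, the matching step of that pseudo-code could be sketched like this (`keysMatching()` is a hypothetical helper, not an SDK function; the key list would come from a ListObjects call as in the answer above):

```php
<?php
// Sketch: given keys listed from a bucket, keep those matching a keyword
// (case-insensitive substring match).
function keysMatching(array $keys, string $keyword): array
{
    return array_values(array_filter($keys, function ($key) use ($keyword) {
        return stripos($key, $keyword) !== false;
    }));
}

// "do something" with each match:
foreach (keysMatching(['pets/dog.jpg', 'pets/cat.png', 'cars/bmw.jpg'], 'cat') as $key) {
    echo $key . PHP_EOL;
}
```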

Related

Uploading Objects to AWS S3 with SDK 3 for PHP

For many years my PHP application has used the AWS SDK 2 for PHP, and now we are considering switching to SDK 3.
However, looking through the SDK documentation we couldn't find any simple example; they all cover multipart uploads and other things very different from what we have today.
The code below is what we have for SDK 2. How would a simple object be uploaded to a bucket using SDK 3?
<?php
define('S3_KEY', '');
define('S3_SECRET', '');
define('S3_BUCKET_NAME', '');

$s3 = S3Client::factory(array(
    'key'    => S3_KEY,
    'secret' => S3_SECRET
));

$filename = 'example/001/1.jpg';

try {
    $resource = fopen($origin, 'r');
    $return = $s3->upload(S3_BUCKET_NAME, $filename, $resource, 'public-read');
    return $return['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
In AWS SDK for PHP Version 3, ObjectUploader can upload a file to Amazon S3 using either PutObject or MultipartUploader, whichever is best for the payload size.
Below is sample code you can use to upload objects into the S3 bucket:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\ObjectUploader;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$s3Client = new S3Client([
    'profile' => 'default',
    'region' => 'us-east-2',
    'version' => '2006-03-01'
]);

$bucket = 'your-bucket';
$key = 'my-file.zip';

// Using a stream instead of a file path
$source = fopen('/path/to/file.zip', 'rb');

$uploader = new ObjectUploader(
    $s3Client,
    $bucket,
    $key,
    $source
);

do {
    try {
        $result = $uploader->upload();
        if ($result["@metadata"]["statusCode"] == '200') {
            print('<p>File successfully uploaded to ' . $result["ObjectURL"] . '.</p>');
        }
        print($result);
    } catch (MultipartUploadException $e) {
        // If a multipart upload was interrupted, resume it from its saved state.
        rewind($source);
        $uploader = new MultipartUploader($s3Client, $source, [
            'state' => $e->getState(),
        ]);
    }
} while (!isset($result));
fclose($source);
Key differences in Version 3 of the SDK:
Use new instead of factory() to instantiate the client.
The 'version' and 'region' options are required during instantiation.

AWS s3 Listing Object Keys Using PHP

I'm working on listing object keys using PHP with AWS S3.
I have the bucket set up with one test file loaded:
Screenshot of my S3 bucket with one test file
Here is the reference that I'm using from the AWS documentation:
AWS ListingObjectKeysUsingPHP webpage
Here is the code that I'm using from that page:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'apollos-integrations-public';

// Instantiate the client.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1'
]);

// Use the high-level iterators (returns ALL of your objects).
try {
    $objects = $s3->getPaginator('ListObjects', [
        'Bucket' => $bucket
    ]);
    //var_dump($objects);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($objects as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}

// Use the plain API (returns ONLY up to 1000 of your objects).
try {
    $result = $s3->listObjects([
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($result['Contents'] as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
When I execute this PHP script, it simply returns "Keys retrieved!" without listing the test file. It should list the files in the folder.
When I uncomment the "var_dump($objects);" line in the script, it returns a lot of data for the object, as shown in this screenshot:
image for the object var dump
For some reason, this part of the code provided by Amazon is not working:
foreach ($objects as $object) {
    echo $object['Key'] . PHP_EOL;
}
Why isn't this code provided by AWS working?
Shouldn't the key and secret be required?
Please help
Your problem was most likely caused by the AWS SDK for PHP not being able to get credentials from your environment variables. The credentials are either taken from environment variables, or you can specify them directly in your code like this:
// Instantiate the client.
$credentials = new Aws\Credentials\Credentials('<KEY>', '<SECRET>');
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1',
    'credentials' => $credentials
]);
(Replace '<KEY>' and '<SECRET>' with your credentials.) You can find the whole script below.
More information can be found here: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_configuration.html
Whole script (notice the first two lines, where I enabled debugging info; this will show you error messages when there is a problem, as in this case with the credentials):
<?php
error_reporting(-1);
ini_set('display_errors', 'On');

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'apollos-integrations-public';

// Instantiate the client.
$credentials = new Aws\Credentials\Credentials('<KEY>', '<SECRET>');
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1',
    'credentials' => $credentials
]);

// Use the high-level iterators (returns ALL of your objects).
try {
    $objects = $s3->getPaginator('ListObjects', [
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    // The paginator yields one page of results per iteration,
    // so iterate the page's 'Contents' to reach the keys.
    foreach ($objects as $result) {
        foreach ($result['Contents'] ?? [] as $object) {
            echo $object['Key'] . PHP_EOL;
        }
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}

// Use the plain API (returns ONLY up to 1000 of your objects).
try {
    $result = $s3->listObjects([
        'Bucket' => $bucket
    ]);
    echo "Keys retrieved!" . PHP_EOL;
    foreach ($result['Contents'] as $object) {
        echo $object['Key'] . PHP_EOL;
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
BTW, this is the error message you would see with debugging info enabled:
Fatal error: Uncaught Aws\Exception\CredentialsException: Error retrieving credentials from the instance profile metadata server.
Hopefully you find this helpful.

How to check if 2 images with different file names are identical with PHP

I am getting images from the web and sending them to an S3 bucket on AWS. Each image gets a random name, but I don't want to store identical twins. So, before storing the image on my server (and then sending it to S3), I need to check whether an identical image already exists there (on S3).
I cannot do the check on my server, because files are deleted after being successfully sent to S3.
Here is my code:
$filenameIn = $url_img;
$nome = randomString();
$nome_img = "$nome.png";
$filenameOut = __DIR__ . '/img/' . $nome_img;
$contentOrFalseOnFailure = file_get_contents($filenameIn);
$bucket = 'bucket-img';

if (!$contentOrFalseOnFailure) {
    return "error1: ...";
}

$byteCountOrFalseOnFailure = file_put_contents($filenameOut, $contentOrFalseOnFailure);
if (!$byteCountOrFalseOnFailure) {
    return "error2: ...";
}

// Instantiate the S3 client with your AWS credentials
$aws = new Aws\S3\S3Client([
    'version' => 'latest',
    'region' => 'sa-east-1',
    'credentials' => array(
        'key' => 'omissis',
        'secret' => 'omissis',
    )
]);

/*
I BELIEVE AT THIS POINT I SHOULD MAKE A COMPARISON TO CHECK IF THE FILE THAT
IS ABOUT TO BE SENT TO S3 HAS AN IDENTICAL TWIN THERE, REGARDLESS OF THEIR
RANDOM NAME. IF SO, UPLOAD SHOULD BE ABORTED AND THE SCRIPT SHOULD PROVIDE
ME WITH THE URL OF THE TWIN.
*/

try {
    // Send a PutObject request and get the result object.
    $result = $aws->putObject([
        'Bucket' => $bucket,
        'Key' => $nome_img,
        'SourceFile' => $filenameOut,
        'ACL' => 'public-read'
    ]);
    $r = $aws->getObjectUrl($bucket, $nome_img);
    return $r;
} catch (S3Exception $e) {
    return $e->getMessage() . "\n";
}
As I said in the code, I should get the URL of the twin file, if it exists.
Is that possible?
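One possible approach (a sketch, not the poster's code): for objects uploaded with a single PutObject call, the S3 ETag is the hex MD5 of the object body, so the MD5 of the new image can be compared against the ETags returned by listObjects. Note this does not hold for multipart or KMS-encrypted uploads. `findTwinKey()` is a hypothetical helper name:

```php
<?php
// Sketch: find an identical object by comparing a local MD5 against S3 ETags.
// Assumes each object was uploaded via a single PutObject call, where the
// ETag equals the MD5 of the body.
function findTwinKey(array $objects, string $localMd5): ?string
{
    foreach ($objects as $object) {
        // S3 returns the ETag wrapped in double quotes, so strip them first.
        if (trim($object['ETag'], '"') === $localMd5) {
            return $object['Key'];
        }
    }
    return null;
}

// With the client from the question, the listing would come from
//   $objects = $aws->listObjects(['Bucket' => $bucket])['Contents'];
// and the local hash from
//   $localMd5 = md5_file($filenameOut);
// Sample data for illustration:
$objects = [
    ['Key' => 'abc123.png', 'ETag' => '"9e107d9d372bb6826bd81d3542a419d6"'],
];
$twin = findTwinKey($objects, '9e107d9d372bb6826bd81d3542a419d6');
echo $twin ?? 'no twin found', PHP_EOL;
```

If a twin is found, getObjectUrl($bucket, $twin) gives its URL and the upload can be skipped. For large buckets, a more scalable variant is to make the MD5 part of the key itself, so one headObject call answers the question without listing everything.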

Unable to open using mode r: fopen(): AWS Elastic Beanstalk

Error: "Unable to open using mode r: fopen(): Filename cannot be empty". I keep getting this error when I try to upload larger files (more than 5 MB). I have deployed the PHP app to AWS Elastic Beanstalk, and I upload the files to AWS S3. I don't even have fopen() in my code.
Also, when I test the site using XAMPP I don't get this error.
This is the code I use to upload file to S3:
<?php
session_start();
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS Info
$bucketName = 'tolga20.images';
$IAM_KEY = '******************';
$IAM_SECRET = '*************************';
$feedback = '';
$unqiue_num = mt_rand(1000, 9999);

if (isset($_FILES['fileToUpload'])) {
    $user_set_id = $_POST['user_set_id'];

    // Connect to AWS
    try {
        // You may need to change the region. It is shown in the URL when the
        // bucket is open, and on creation.
        $s3 = S3Client::factory(
            array(
                'credentials' => array(
                    'key' => $IAM_KEY,
                    'secret' => $IAM_SECRET
                ),
                'version' => 'latest',
                'region' => 'eu-west-2'
            )
        );
    } catch (Exception $e) {
        // We use a die, so if this fails, it stops here. Typically this is a
        // REST call, so this would return a JSON object.
        die("Error: " . $e->getMessage());
    }

    $temp_name = explode(".", $_FILES["fileToUpload"]["name"]);
    $newfilename = $unqiue_num . "-" . $user_set_id . '.' . end($temp_name);

    // For this, I would generate a unique random string for the key name. But you can do whatever.
    $keyName = 'images/' . basename($newfilename);
    $pathInS3 = 'https://s3.eu-west-2.amazonaws.com/' . $bucketName . '/' . $keyName;

    // Add it to S3
    try {
        // Uploaded:
        $file = $_FILES["fileToUpload"]['tmp_name'];
        $s3->putObject(
            array(
                'Bucket' => $bucketName,
                'Key' => $keyName,
                'SourceFile' => $file,
                'StorageClass' => 'REDUCED_REDUNDANCY'
            )
        );
    } catch (S3Exception $e) {
        die('Error:' . $e->getMessage());
    } catch (Exception $e) {
        die('Error:' . $e->getMessage());
    }

    //$feedback = 'File uploaded! Custom name: ' . '<b><i>' . $newfilename;
    $_SESSION['newfilename'] = $newfilename;
    header("Location: next.php");
}
?>
Try increasing the post_max_size and upload_max_filesize values in your php.ini!
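For example, in php.ini (the 64M limits below are illustrative; pick values that fit your largest expected upload, and restart the web server afterwards):

```ini
; Illustrative limits, assuming uploads up to ~64 MB.
; post_max_size should be at least as large as upload_max_filesize.
upload_max_filesize = 64M
post_max_size = 64M
```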
$file = $_FILES["fileToUpload"]['tmp_name'];
change to
$file = $_FILES["fileToUpload"]['name'];
Hope it will solve your problem.

PHP Script not proceeding ahead with Amazon S3 Upload

I am trying to upload a file to Amazon S3 using the AWS PHP SDK, but whenever I try uploading, the script produces output up to the PutObject operation and nothing after it. I am not even able to dump the result object. The credentials are stored in .aws in the root folder of my test machine (running Ubuntu). Below is the code -
$s3 = new Aws\S3\S3Client([
    'version' => 'latest',
    'region' => 'us-east-1'
]);

echo "<br />" . realpath(UPLOAD_DIR . $apiKEY . "/" . $name);

try {
    $result = $s3->putObject([
        'Bucket' => 'quicklyusercannedspeechbucket',
        'ContentType' => 'text/plain',
        'Key' => $apiKEY . "/" . $name,
        'SourceFile' => realpath(UPLOAD_DIR . $apiKEY . "/" . $name)
    ]);
    var_dump($result);
} catch (\Aws\S3\Exception\S3Exception $e) {
    echo $e->getAwsErrorCode();
}

echo "Finished";
var_dump($result);
When I run the above code, I don't get any output for the $result object. Any idea what might be happening?
Any help is appreciated ^_^
-Pranav
Use the code below to check whether your image was uploaded successfully:
try {
    $result = $s3->putObject([
        'Bucket' => 'quicklyusercannedspeechbucket',
        'ContentType' => 'text/plain',
        'Key' => $apiKEY . "/" . $name,
        'SourceFile' => realpath(UPLOAD_DIR . $apiKEY . "/" . $name)
    ]);
    $s3file = 'http://quicklyusercannedspeechbucket.s3.amazonaws.com/' . $name;
    echo "<img src='$s3file'/>";
    echo 'S3 File URL:' . $s3file;
    var_dump($result);
} catch (\Aws\S3\Exception\S3Exception $e) {
    echo $e->getAwsErrorCode();
}
You can get step-by-step detail here: Amazon S3 File Upload Using PHP
It seems that I had put the credentials in the wrong folder. Under NGINX, the default folder for the credentials was /var/www rather than the home directory of the user.
Also, I turned on display_errors in php.ini, which helped me find that the AWS SDK was having problems with the credentials.
Thanks!
