PHP Script not proceeding ahead with Amazon S3 Upload - php

I am trying to upload a file to Amazon S3 using the AWS PHP SDK, but whenever I try uploading, the script produces output up to the putObject operation and nothing after it. I am not even able to dump the result object. The credentials are stored in .aws in the root folder of my test machine (running Ubuntu). Below is the code:
$s3 = new Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

echo "<br />" . realpath(UPLOAD_DIR . $apiKEY . "/" . $name);

try {
    $result = $s3->putObject([
        'Bucket'      => 'quicklyusercannedspeechbucket',
        'ContentType' => 'text/plain',
        'Key'         => $apiKEY . "/" . $name,
        'SourceFile'  => realpath(UPLOAD_DIR . $apiKEY . "/" . $name)
    ]);
    var_dump($result);
}
catch (\Aws\S3\Exception\S3Exception $e) {
    echo $e->getAwsErrorCode();
}

echo "Finished";
var_dump($result);
When I run the above code, I don't get any output for the $result array. Any idea what might be happening?
Any help is appreciated ^_^
-Pranav

Use the code below to check whether your image was uploaded successfully:
try {
    $result = $s3->putObject([
        'Bucket'      => 'quicklyusercannedspeechbucket',
        'ContentType' => 'text/plain',
        'Key'         => $apiKEY . "/" . $name,
        'SourceFile'  => realpath(UPLOAD_DIR . $apiKEY . "/" . $name)
    ]);
    // The URL must match the object key ($apiKEY . "/" . $name), not just $name
    $s3file = 'http://quicklyusercannedspeechbucket.s3.amazonaws.com/' . $apiKEY . '/' . $name;
    echo "<img src='$s3file'/>";
    echo 'S3 File URL:' . $s3file;
    var_dump($result);
}
catch (\Aws\S3\Exception\S3Exception $e) {
    echo $e->getAwsErrorCode();
}
You can get step-by-step details here: Amazon S3 File Upload Using PHP

It turned out that I had put the credentials in the wrong folder. Under NGINX, the home directory of the web user was /var/www rather than my login user's home directory, so that is where the SDK looked for the credentials.
Also, I turned on display_errors in php.ini, which helped me discover that the AWS SDK was having trouble reading the credentials.
Thanks!
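For reference, one way to sidestep the home-directory ambiguity entirely is to point the SDK at an explicit credentials file. This is a minimal sketch assuming the AWS SDK for PHP v3; the path below is illustrative, not prescribed:

```php
<?php
require 'vendor/autoload.php';

use Aws\Credentials\CredentialProvider;
use Aws\S3\S3Client;

// Under nginx/php-fpm the web user's HOME is often /var/www, so the SDK
// looks for /var/www/.aws/credentials. Binding a provider to an explicit
// ini file (path below is illustrative) removes the guesswork.
$provider = CredentialProvider::ini('default', '/var/www/.aws/credentials');
$provider = CredentialProvider::memoize($provider); // cache the lookup

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'credentials' => $provider,
]);
```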

Related

Amazon S3 working in production but not from localhost

I cannot upload to my S3 bucket from localhost despite it working fine in production. No errors are caught during upload, but the files never appear (neither PDFs nor images). I have no blocking of public access (which appears to be the issue for most people).
I saw one person on here solve their problem by switching on allow_url_fopen in their PHP settings. Mine was already on.
EDIT: I should note that I have a Python script, run from Windows, which uses the bucket without any problem. That makes me think it has something to do with my WAMP server settings, but neither my PHP nor Apache logs show anything and the settings look fine.
Any other ideas?
require_once '../../includes/aws/aws-autoloader.php';

use Aws\S3\Exception\S3Exception;

$msg = "";
if (isset($_POST['fileUpload'])) {
    $fileUpload = $_POST['fileUpload'];
    $dirPath    = "drives/nota_simple/projects/";
    $pdf_parts  = explode(";base64,", $fileUpload["pdf"]);
    $pdf_base64 = $pdf_parts[1];

    $s3 = new Aws\S3\S3Client([
        'region'      => 'eu-west-1',
        'version'     => '2006-03-01',
        'credentials' => [
            'key'    => "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
            'secret' => "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
        ]
    ]);

    $projectFolder = utf8_decode(filter_var($fileUpload['fileProject'], FILTER_SANITIZE_STRING));
    $dirPath .= $projectFolder;
    $dirPath .= '/' . $fileUpload['fileIdTask'];
    $id_image = [taken from DB]
    $key = $dirPath . '/' . $id_image . '.' . $fileUpload['fileExt'];

    try { // Save the nota_simple
        $result = $s3->putObject([
            'Bucket'      => 'BUCKET_NAME',
            'Key'         => $key,
            'Body'        => base64_decode($pdf_base64),
            'ContentType' => 'application/' . $fileUpload['fileExt'],
            'ACL'         => 'public-read'
        ]);
    } catch (S3Exception $e) {
        echo $e->getMessage() . "\n";
        $msg = "Fallo al subir nota_simple: " . $e->getMessage();
        $response = array("error" => $msg);
    }
    // NOTE: this runs even after a failed upload, overwriting the error response
    $response = array("success" => "Guardado con suceso.");
The issue was within my localhost server settings:
AWS HTTP error: cURL error 77: error setting certificate verify locations: CAfile: C:/wamp64/bin/php7.0.33/extras/ssl/cacert.pem
This was not the correct path.
I added the correct path to php.ini (as seen below) and restarted the server. Now it works!
curl.cainfo="c:/wamp64/bin/php/php7.0.33/extras/ssl/cacert.pem"
openssl.cafile="c:/wamp64/bin/php/php7.0.33/extras/ssl/cacert.pem"
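A quick way to confirm the path fix took effect is to check what PHP actually loaded. The helper below is a sketch (the function name is mine, not part of any library):

```php
<?php
// Report whether the CA bundle path PHP is configured with actually exists.
function describeCaBundle($caFile) {
    if ($caFile === false || $caFile === '') {
        return 'curl.cainfo is not set';
    }
    if (!file_exists($caFile)) {
        return 'missing file: ' . $caFile;
    }
    return 'found: ' . $caFile;
}

echo describeCaBundle(ini_get('curl.cainfo')), "\n";
```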

SSL Certification issue cURL error 60 and Error executing "PutObject"

I am trying to send a .txt file to an S3 bucket on AWS, but whenever I run my code under MAMP I get the following error:
Error: Error executing "PutObject" on "https://dynamics-bucket-qa.s3.eu-west-2.amazonaws.com/folder/Test.txt"; AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
I have placed the cacert.pem file, which I downloaded from the internet, in the extras/ssl folder in MAMP, and I have also updated php.ini with curl.cainfo = "C:\MAMP\bin\php\php7.2.10\extras\ssl\cacert.pem"
After all this, it still does not work. I've been trying for a long time but haven't managed to fix it; I'm pretty new to all this.
I'm using PHP version 7.2.10
<?php
// Run: $ composer require aws/aws-sdk-php
require '../vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS info
$bucketName = 'dynamics-bucket-qa';
$IAM_KEY    = 'XXXXXXXXXXXXXXXXXXXXXXX';
$IAM_SECRET = 'XXXXXXXXXXXXXXXX';

// Connect to AWS
try {
    // You may need to change the region. It is shown in the URL when the
    // bucket is open, and on creation.
    $s3 = S3Client::factory(
        array(
            'credentials' => array(
                'key'    => $IAM_KEY,
                'secret' => $IAM_SECRET
            ),
            'version' => 'latest',
            'region'  => 'eu-west-2'
        )
    );
} catch (Exception $e) {
    die("Error: " . $e->getMessage());
}

// For this, I would generate a unique random string for the key name, but you can do whatever.
$keyName  = 'folder/' . basename($_FILES["fileToUpload"]['name']);
$pathInS3 = 'https://s3.eu-west-2.amazonaws.com/' . $bucketName . '/' . $keyName;

// Add it to S3
try {
    $file = $_FILES["fileToUpload"]['tmp_name'];
    $s3->putObject(
        array(
            'Bucket'       => $bucketName,
            'Key'          => $keyName,
            'SourceFile'   => $file,
            'StorageClass' => 'REDUCED_REDUNDANCY'
        )
    );
} catch (S3Exception $e) {
    die('Error:' . $e->getMessage());
} catch (Exception $e) {
    die('Error:' . $e->getMessage());
}

echo 'Done';
?>
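For what it's worth, the CA bundle can also be handed to the SDK per client instead of via php.ini; the SDK forwards it to cURL through its HTTP handler's verify option. A sketch with placeholder keys, reusing the MAMP path from the question:

```php
<?php
require '../vendor/autoload.php';

use Aws\S3\S3Client;

// Pass the CA bundle directly to this client's HTTP handler rather than
// relying on curl.cainfo in php.ini (keys are placeholders).
$s3 = S3Client::factory(array(
    'credentials' => array(
        'key'    => 'YOUR-KEY',
        'secret' => 'YOUR-SECRET',
    ),
    'version' => 'latest',
    'region'  => 'eu-west-2',
    'http'    => array(
        'verify' => 'C:\MAMP\bin\php\php7.2.10\extras\ssl\cacert.pem',
    ),
));
```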

How to check if 2 images with different file names are identical with PHP

I am fetching images from the web and sending them to an S3 bucket on AWS. Each image gets a random name, but I don't want to store identical twins. So, before storing an image on my server (and then sending it to S3), I need to check whether an identical image already exists there (on S3).
I cannot do the check on my server, because files are deleted after being successfully sent to S3.
Here is my code:
$filenameIn  = $url_img;
$nome        = randomString();
$nome_img    = "$nome.png";
$filenameOut = __DIR__ . '/img/' . $nome_img;

$contentOrFalseOnFailure = file_get_contents($filenameIn);
$bucket = 'bucket-img';

if (!$contentOrFalseOnFailure) {
    return "error1: ...";
}

$byteCountOrFalseOnFailure = file_put_contents($filenameOut, $contentOrFalseOnFailure);
if (!$byteCountOrFalseOnFailure) {
    return "error2: ...";
}

// Instantiate the S3 client with your AWS credentials
$aws = new Aws\S3\S3Client([
    'version'     => 'latest',
    'region'      => 'sa-east-1',
    'credentials' => array(
        'key'    => 'omissis',
        'secret' => 'omissis',
    )
]);

/*
I BELIEVE AT THIS POINT I SHOULD MAKE A COMPARISON TO CHECK IF THE FILE THAT
IS ABOUT TO BE SENT TO S3 HAS AN IDENTICAL TWIN THERE, REGARDLESS OF THEIR
RANDOM NAME. IF SO, UPLOAD SHOULD BE ABORTED AND THE SCRIPT SHOULD PROVIDE
ME WITH THE URL OF THE TWIN.
*/

try {
    // Send a PutObject request and get the result object.
    $result = $aws->putObject([
        'Bucket'     => $bucket,
        'Key'        => $nome_img,
        'SourceFile' => $filenameOut,
        'ACL'        => 'public-read'
    ]);
    $r = $aws->getObjectUrl($bucket, $nome_img);
    return $r;
} catch (S3Exception $e) {
    return $e->getMessage() . "\n";
}
As I said in the code, I should get the url of the twin file, if it exists.
Is that possible?
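One common pattern, sketched below under the assumption that "identical" means byte-for-byte equal: derive the object key from a hash of the image bytes, so twins collide on the same key and the SDK v3 client's doesObjectExist reveals them. The contentKey helper is mine, not part of the SDK:

```php
<?php
// Key each image by the MD5 of its bytes: identical images always map to
// the same key, regardless of their random file names.
function contentKey($bytes, $ext = 'png') {
    return md5($bytes) . '.' . $ext;
}

// With the $aws client and variables from the code above, the upload would
// become (commented out so this sketch runs standalone):
//
// $key = contentKey($contentOrFalseOnFailure);
// if ($aws->doesObjectExist($bucket, $key)) {
//     return $aws->getObjectUrl($bucket, $key);  // twin found: return its URL
// }
// $aws->putObject([
//     'Bucket'     => $bucket,
//     'Key'        => $key,
//     'SourceFile' => $filenameOut,
//     'ACL'        => 'public-read',
// ]);
// return $aws->getObjectUrl($bucket, $key);

echo contentKey('example-bytes'), "\n";
```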

Laravel and AWS PHP SDK - Unable to delete a local file after it was uploaded to S3

I am trying to delete a file from a local directory right after uploading it to AWS S3. When I run it on Vagrant I get the error "Text-file: busy", and when I run it on XAMPP I get "permission denied". For some reason the AWS S3 putObject method is not releasing the file handle. I have tried unsetting the S3 client object, but that didn't work.
Here is the code:
$tempName = public_path() . '/path/to/file';

// Initialize AWS
$s3 = AWS::createClient('s3');

// Upload image to AWS
try {
    $response = $s3->putObject(array(
        'Bucket'     => 'zotamoda',
        'Key'        => $productImage->image_folder . "/" . $productImage->image_name,
        'SourceFile' => $tempName,
        'ACL'        => 'public-read',
    ));
} catch (S3Exception $e) {
    // The AWS error code
    echo $e->getAwsErrorCode() . "\n";
    // The error message
    echo $e->getMessage() . "\n";
}

// Delete image from temporary location
unlink($tempName);
You could try:
Storage::disk('s3')->put($productImage->image_folder."/".$productImage->image_name, file_get_contents($tempName), 'public');
unlink($tempName);
or, assuming that $tempName is relative to your project root:
Storage::disk('local')->delete($tempName)
I think you should try calling:
gc_collect_cycles();
before deleting the file
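Another angle on the busy-handle problem, sketched under the assumption that the SDK is holding the stream open: open the stream yourself and pass it as 'Body' instead of 'SourceFile', so you control when it is closed. The putObject call is commented out (with the bucket/key from the question) so the sketch runs standalone:

```php
<?php
// Simulate the temp file from the question.
$tempName = tempnam(sys_get_temp_dir(), 'img');
file_put_contents($tempName, 'fake image bytes');

$handle = fopen($tempName, 'rb');

// $s3->putObject(array(
//     'Bucket' => 'zotamoda',
//     'Key'    => $productImage->image_folder . '/' . $productImage->image_name,
//     'Body'   => $handle,  // SDK reads from our stream instead of opening its own
//     'ACL'    => 'public-read',
// ));

fclose($handle);       // release the handle before deleting
gc_collect_cycles();   // clear any lingering stream references
var_dump(unlink($tempName));
```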

Using PHP to upload to Amazon S3

I've spent the last few hours following tutorials for implementing file uploads to Amazon S3 using php. I uploaded the most recent version of Donovan Schönknecht's S3 class to my server (as S3.php) and I am trying to use the following code to test upload capability. I know this code will work because I've seen numerous examples in action.
<?php
require('S3.php');

$s3 = new S3('KEY', 'SECRET KEY');

// Insert into S3
$new_name = time() . '.txt';
S3::putObject(
    'upload-me.txt',
    'bucketName',
    $new_name,
    S3::ACL_PUBLIC_READ,
    array(),
    array(),
    S3::STORAGE_CLASS_RRS
);
?>
I get a 500 server error when I attempt to load this page. Additionally, every other reputable tutorial of this nature has given me the same 500 error.
I verified that my key and secret key are valid by connecting to S3 with Cyberduck.
Does anyone have a clue as to what I could be doing incorrectly?
Thanks,
Sean
As it turns out, I was missing the cURL extension for PHP, which was a problem because the S3 class I was using requires cURL. All is working now.
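A quick sanity check for this class of 500 errors: verify that the extensions the standalone S3 class depends on are actually loaded (cURL for HTTP, OpenSSL for signing/TLS):

```php
<?php
// Report whether the extensions the standalone S3 class relies on are loaded.
foreach (array('curl', 'openssl') as $ext) {
    echo $ext, ': ', extension_loaded($ext) ? 'loaded' : 'MISSING', "\n";
}
```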
You should also consider using the official AWS SDK for PHP. Examples for using S3 with the SDK can be found in their S3 user guide.
You can download the most recent version of the AWS PHP SDK by running the following composer command:
composer require aws/aws-sdk-php
The configuration to upload a file to Amazon S3 then looks like this:
// Include the SDK using the Composer autoloader
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Set Amazon S3 credentials
$client = S3Client::factory(
    array(
        'key'    => "your-key",
        'secret' => "your secret key"
    )
);

try {
    $client->putObject(array(
        'Bucket'       => 'your-bucket-name',
        'Key'          => 'your-filepath-in-bucket',
        'SourceFile'   => 'source-filename-with-path',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
} catch (S3Exception $e) {
    // Catch an S3 specific exception.
    echo $e->getMessage();
}
Step-by-step details are available here: Amazon S3 File Upload Using PHP
The following example worked for me:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$client = S3Client::factory([
    'version'     => 'latest',
    'region'      => 'us-west-1',
    'credentials' => [
        'key'    => "<secret-key>",
        'secret' => "<my-secret>"
    ]
]);

try {
    $client->putObject([
        'Bucket'     => '<my-bucket-name>',
        'Key'        => '<file-name>',
        'SourceFile' => '<file-path-on-server>', // e.g. /var/www/vhosts/mysite/file.csv
        'ACL'        => 'public-read',
    ]);
} catch (S3Exception $e) {
    // Catch an S3 specific exception.
    echo $e->getMessage();
}
Getting security credentials:
https://aws.amazon.com/blogs/security/wheres-my-secret-access-key/
https://console.aws.amazon.com/iam/home?#/security_credential
Getting region code
https://docs.aws.amazon.com/general/latest/gr/rande.html
Use this one to upload images using a form; it's working fine for me. You may try adapting it to your code.
$name = $_FILES['photo']['name'];
$size = $_FILES['photo']['size'];
$tmp  = $_FILES['photo']['tmp_name'];

// Upload process
// Bucket name
$bucket = 'bucket-name';

require_once('S3.php');

// AWS access info
$awsAccessKey = 'awsAccessKey';
$awsSecretKey = 'awsSecretKey';

// Instantiate the class
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// Rename the image.
$actual_image_name = time();

// Upload to S3
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $image = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
} else {
    echo 'error uploading to S3 Amazon';
}
I never found a script updated for Amazon's latest SDK, so I made one myself. It works as a PHP command-line script; give it a try:
https://github.com/arizawan/aiss3clientphp
I'm not familiar with the S3 API, but I used it as the storage backend with https://github.com/KnpLabs/Gaufrette. Gaufrette is a library that provides a nice abstraction layer over S3 and other file services/systems.
Here is sample code to upload images to Amazon S3.
// Bucket name
$bucket = "BucketName";

if (!class_exists('S3')) require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'ACCESS_KEY');
if (!defined('awsSecretKey')) define('awsSecretKey', 'ACCESS_Secret_KEY');

$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// $tmp is the uploaded file's tmp_name from your form handler
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $message = "S3 Upload Successful.";
    $s3file  = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
    echo "<img src='$s3file'/>";
    echo 'S3 File URL:' . $s3file;
} else {
    $message = "S3 Upload Fail.";
}
Below is a solution using multipart upload. Make sure to install the AWS SDK for PHP before using it.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

try {
    $s3Client = new S3Client([
        'version'     => 'latest',
        'region'      => 'us-east-1',
        'credentials' => [
            'key'    => 'YOUR-AWS-KEY',
            'secret' => 'YOUR-AWS-SECRET',
        ],
    ]);

    // Use multipart upload
    $source   = 'https://b8q9h6y2.stackpathcdn.com/wp-content/uploads/2016/08/banner-for-website-4.png';
    $uploader = new MultipartUploader($s3Client, $source, [
        'bucket' => 'videofilessandeep',
        'key'    => 'my-file.png',
        'ACL'    => 'public-read',
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete: {$result['ObjectURL']}\n";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage() . "\n";
    }
} catch (Exception $e) {
    print_r($e);
}
