AWS S3 displaying a file on the client browser - php

I am trying to use AWS S3 to store all of my app's files. I was able to upload them to the bucket successfully, but I am not able to retrieve an object to show it to the user. I am getting the error "You cannot call GetObject on the service resource." I am not sure what is wrong. I thought it might be a permissions issue, but if so, how come I can upload the file?
Here is what I have:
function aws_file_upload($key, $file)
{
    $aws = aws();
    // Get a resource representing the S3 service.
    $s3 = $aws->s3;
    $bucket = $s3->bucket('my-bucket-name');
    try {
        $result = $bucket->putObject([
            'Key'  => $key,
            'Body' => fopen($file, 'r'),
        ]);
        $status = 'OK';
    } catch (Exception $e) {
        $status = 'error';
    }
    return $status;
}
function aws_file_get($key)
{
    $aws = aws();
    $s3 = $aws->s3;
    $result = $s3->getObject([
        'Bucket' => 'my-bucket-name',
        'Key'    => $key
    ]);
    // Display the object in the browser.
    header("Content-Type: {$result['ContentType']}");
    echo $result['Body'];
}
$key = 'Cases/my-file.pdf';
$file_name = 'my-file.pdf';
$res = aws_file_upload($key, $file_name); // this puts the file into the AWS bucket
$result = aws_file_get($key); // this throws the error

You need to create an S3 client.
Here is some sample code from Creating and Using Amazon S3 Buckets with the AWS SDK for PHP Version 3 - AWS SDK for PHP:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-west-2',
    'version' => '2006-03-01'
]);

$result = $s3Client->putObject([
    'Bucket'     => $bucket,
    'Key'        => $key,
    'SourceFile' => $file_Path,
]);
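For the retrieval side the same client exposes getObject. A minimal sketch (bucket name and key are placeholders, and this assumes valid credentials in the default profile):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-west-2',
    'version' => '2006-03-01'
]);

try {
    $result = $s3Client->getObject([
        'Bucket' => 'my-bucket-name',
        'Key'    => 'Cases/my-file.pdf',
    ]);
    // Stream the object back to the browser.
    header("Content-Type: {$result['ContentType']}");
    echo $result['Body'];
} catch (AwsException $e) {
    http_response_code(404);
    echo $e->getMessage();
}
```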

For anyone wanting an answer, here was my solution. With the resource API, GetObject has to be called on an Object resource rather than on the service resource:
$aws = aws();
$s3 = $aws->s3;
$bucket = $s3->bucket('my-bucket-name');
$result = $bucket->object($key)->get();

header("Content-Type: {$result['ContentType']}");
echo $result['Body'];
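An alternative worth noting: instead of proxying the bytes through PHP, you can hand the browser a presigned URL and let it fetch the object directly from S3. A sketch using the plain S3Client (bucket, key, and expiry are placeholders):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-west-2',
    'version' => '2006-03-01'
]);

// Build a GetObject command and sign it for 10 minutes.
$cmd = $s3Client->getCommand('GetObject', [
    'Bucket' => 'my-bucket-name',
    'Key'    => 'Cases/my-file.pdf',
]);
$request = $s3Client->createPresignedRequest($cmd, '+10 minutes');

// Redirect the browser straight to S3.
header('Location: ' . (string) $request->getUri());
```

This avoids holding the PHP process open while large files stream through it.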

Related

How can we generate a daily backup in DigitalOcean S3 storage (Spaces) by using a PHP script?

require 'vendor/autoload.php';

use Aws\S3\S3Client;
// use Aws\S3\Exception\S3Exception;

/***************** S3 DigitalOcean connection ***************************************/
$client = new Aws\S3\S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'endpoint'    => '',
    'credentials' => [
        'key'    => '',
        'secret' => ''
    ],
]);

$date = date("dmy");
$new_bucketName = "s3-backup-new-" . $date;
$prevdate = "070223"; // hard-coded test date; the next line overrides the value set above
$new_bucketName = "s3-backup-new-" . $prevdate;

$spacesBucket = $client->listBuckets();

// Copy the files and folders.
$livefolder = "s3";
$iterator = $client->getIterator('ListObjects', array('Bucket' => $livefolder));
foreach ($iterator as $obj) {
    $response = $client->doesObjectExist($new_bucketName, $livefolder . "/" . $obj['Key']);
    echo $response;
    if ($response != 1) {
        echo "new";
        $client->copyObject([
            'Bucket'     => $new_bucketName,
            'CopySource' => $livefolder . "/" . $obj['Key'],
            'Key'        => $obj['Key'],
        ]);
    }
}

/****************** Delete old backup (-5 days) ***********************/
$next_due_date = 's3-backup-new-' . date('dmy', strtotime("-5 days"));
// $next_due_date = 's3-backups';
$spaces = $client->listBuckets();
foreach ($spaces['Buckets'] as $space) {
    if ($space['Name'] == $next_due_date) {
        $objects_delete = $client->getIterator('ListObjects', array('Bucket' => $next_due_date));
        foreach ($objects_delete as $obj_delete) {
            $client->deleteObject([
                'Bucket' => $next_due_date,
                'Key'    => $obj_delete['Key'],
            ]);
        }
        $client->deleteBucket([
            'Bucket' => $next_due_date,
        ]);
    }
}
echo "Copied successfully";
$response = array("Status" => "Success", "Message" => "Export successfully", "Data" => "");
echo json_encode($response);
/****************** End Delete Old Backup -5days ***********************/
This is the code we have used to create the S3 bucket, copy content to another bucket, and delete the old one.
But we are facing another issue: our original bucket contains more than 10 TB of data, and we need to copy the whole dataset to another bucket. It is not working. After one hour the copying stops, so it only copies about 28 GB to the other bucket. How can we copy the whole dataset to another bucket without the process stopping? If anyone knows, please help.
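One plausible cause (an assumption, since no error output is shown) is PHP's execution-time limit when the script runs under a web server. A common pattern is to run the copy from the CLI, disable the time limit, and keep the existence check so that a restart resumes where the last run stopped. Bucket names below are placeholders:

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

set_time_limit(0); // no execution-time limit (the CLI usually has none anyway)

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$source = 'source-bucket';
$target = 'target-bucket';

// Paginate through the source bucket; each ListObjectsV2 page holds up to 1000 keys.
$pages = $client->getPaginator('ListObjectsV2', ['Bucket' => $source]);
foreach ($pages as $page) {
    foreach ($page['Contents'] ?? [] as $obj) {
        // Skipping keys that already exist makes the loop safely restartable.
        if (!$client->doesObjectExist($target, $obj['Key'])) {
            $client->copyObject([
                'Bucket'     => $target,
                'Key'        => $obj['Key'],
                'CopySource' => $source . '/' . rawurlencode($obj['Key']),
            ]);
        }
    }
}
```

Note that a single CopyObject call is limited to 5 GB per object; for larger objects the SDK provides a multipart copy path (Aws\S3\MultipartCopy), which would need to be swapped in for those keys.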

How Do I Create a Dynamic Keyname Using Php AWS S3 API

I have been testing the latest PHP AWS S3 API and I am not sure how to dynamically name the keyname when a user uploads a new book; I need to be able to retrieve the uploaded book by XYZ customer. Thanks in advance!
<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS S3 setup below
$bucket = 'acmebooks';
// CREATE DYNAMIC KEYNAME??
$keyname = 'RANDOM KEYNAME';
$secret = 'FOOBAR1345';
$credentials = new Aws\Credentials\Credentials($keyname, $secret);

$s3 = new Aws\S3\S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-2',
    'credentials' => $credentials
]);

try {
    // Upload data.
    $result = $s3->putObject([
        'Bucket' => $bucket,
        'Key'    => $keyname,
        //'Body' => 'Hello, world!',
        'Body'   => 'https://booksrun.com/image-loader/350/https:__images-na.ssl-images-amazon.com_images_I_41sYJq3nAWL.jpg',
        'ACL'    => 'public-read'
    ]);
    // Print the URL to the object.
    echo $result['ObjectURL'] . PHP_EOL;
    print_r($result['Body']);
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
I was overlooking and confusing $keyname, which is my unique AWS S3 access key, with the "Key name" of what is PUT into the AWS S3 bucket.
The fix is below:
<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// TEST FOR AWS S3
$key_input = 'jaybookcover1';

// AWS S3 setup below
$bucket = 'acmebooks';
// UNIQUE AWS S3 KEYNAME
$keyname = ' my unique AWS S3 key';
$secret = 'FOOBAR1345';
$credentials = new Aws\Credentials\Credentials($keyname, $secret);

$s3 = new Aws\S3\S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-2',
    'credentials' => $credentials
]);

try {
    // Upload data.
    $result = $s3->putObject([
        'Bucket' => $bucket,
        'Key'    => $key_input,
        //'Body' => 'Hello, world!',
        'Body'   => 'https://booksrun.com/image-loader/350/https:__images-na.ssl-images-amazon.com_images_I_41sYJq3nAWL.jpg',
        'ACL'    => 'public-read'
    ]);
    // Print the URL to the object.
    echo $result['ObjectURL'] . PHP_EOL;
    print_r($result['Body']);
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
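To make the keyname itself dynamic per customer, one hypothetical approach (the helper name and key layout below are illustrative, not part of the AWS SDK) is to build the key from the customer ID plus a unique suffix, so each customer's uploads can later be listed by prefix:

```php
<?php
// Hypothetical helper: build a per-customer S3 object key.
function make_book_key(string $customerId, string $fileName): string
{
    // A unique suffix avoids collisions when the same file name is uploaded twice.
    $unique = uniqid('', true);
    return "books/{$customerId}/{$unique}-{$fileName}";
}

$key = make_book_key('customer42', 'jaybookcover1.jpg');
echo $key, PHP_EOL; // e.g. books/customer42/65a1b2c3d4e5f.67890123-jaybookcover1.jpg
```

All of one customer's books can then be retrieved with ListObjectsV2 using 'Prefix' => 'books/customer42/'.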

AWS s3 Uploading files on localhost, but not working in the real URL

function setUploadAWS($ORIGEM, $DESTINO, $DIR) {
    $BUCKET = 'some-bucket';
    $ACCESS_KEY = "some-access-key";
    $SECRET_KEY = "some-secret-key";
    $credentials = new Aws\Credentials\Credentials($ACCESS_KEY, $SECRET_KEY);
    try {
        $s3 = new S3Client([
            'version'     => 'latest',
            'region'      => 'us-east-1',
            'credentials' => $credentials
        ]);
        $result = $s3->putObject([
            'Bucket'     => $BUCKET,
            'Key'        => $DIR . $DESTINO,
            'SourceFile' => $ORIGEM,
        ]);
    } catch (S3Exception $e) {
        echo $e->getMessage() . "\n";
    }
}
The AWS code for uploading pictures works when run through localhost, but it is not able to upload files from the live server.
The solution was to set the allow_url_fopen and allow_url_include parameters to On:
allow_url_fopen = On
allow_url_include = On
The parameters are listed in php.ini.
Path = apache2/php.ini
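If you want to confirm which settings the server is actually running with (useful when localhost and production load different php.ini files), a quick check from PHP itself:

```php
<?php
// Report the ini settings the fix above depends on, and which php.ini is in use.
var_dump((bool) ini_get('allow_url_fopen'));
var_dump((bool) ini_get('allow_url_include'));
echo php_ini_loaded_file(), PHP_EOL; // path of the php.ini actually loaded
```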

Sending file to S3 from my server

Here is my code, which works if I run it locally but not if I run it on my server, to transfer my CSV to S3 using PHP.
<?php
// Include the AWS SDK using the Composer autoloader.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-2'
]);

$bucket = 'edtopia';
$keyname = 'sample_1.csv';

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key'    => $keyname,
        'Body'   => fopen('numberclub.org/edtopiadb/df_corr.csv', 'r'),
        'ACL'    => 'public-read'
    ));
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
EDIT: This is now working for me on the server. Now my challenge is to access a file on my local system and send it to S3 using this PHP script running on the server, where in the Body I would need to reference my local file path. What is an effective way to achieve this?
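A script running on the server cannot read a path on your local machine directly; the usual pattern (a sketch, assuming a simple HTML form posts the file to this script) is to let PHP receive the upload and then push its temporary copy to S3:

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Expects a form like <input type="file" name="csv"> posted to this script.
if (!isset($_FILES['csv']) || $_FILES['csv']['error'] !== UPLOAD_ERR_OK) {
    exit('No file uploaded');
}

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-2'
]);

try {
    $result = $s3->putObject([
        'Bucket'     => 'edtopia',
        'Key'        => basename($_FILES['csv']['name']),
        'SourceFile' => $_FILES['csv']['tmp_name'], // PHP's temp copy of the upload
        'ACL'        => 'public-read',
    ]);
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
```

For very large files, an alternative is to presign a PutObject request so the browser uploads straight to S3 without passing through the server at all.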

Upload file on amazon S3 with PHP SDK

I'm trying to upload a picture to my Amazon S3 via their PHP SDK, so I made a little script to do so. However, my script doesn't work, and my exception handler doesn't give me back any error message.
I'm new to AWS; thank you for your help.
Here is the code :
Config.php
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key'    => 'PUBLICKEY',
                'secret' => 'PRIVATEKEY',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Index.php
<?php
// Installing AWS SDK via phar
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infact';
$keyname = 'myImage';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory('config.php');

// Upload a file.
try {
    $result = $s3->putObject(array(
        'Bucket'       => $bucket,
        'Key'          => $keyname,
        'SourceFile'   => $filePath,
        'ContentType'  => 'text/plain',
        'ACL'          => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
EDIT: I'm now using this code, but it's still not working. I don't even get an error or exception message.
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infactr';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => 'key',
    'secret' => 'privatekey',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'      => $bucket,
        'Key'         => $keyname,
        'SourceFile'  => $filePath,
        'ACL'         => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Try something like this (from the AWS docs):
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '<your bucket name>';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/path/to/image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => 'your AWS access key',
    'secret' => 'your AWS secret access key'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'SourceFile' => $filepath,
        'ACL'        => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
It works fine for me as long as you have the right credentials. Keep in mind that the key name is the name of your file in S3, so if you want the key to have the same name as your file, you have to do something like: $keyname = 'image.jpg';. Also, a JPG is generally not a text/plain file type; you can omit that ContentType field or simply specify image/jpeg.
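Following that advice, the key name can be derived from the file path itself so the two stay in sync (a small illustrative snippet):

```php
<?php
$filepath = '/path/to/image.jpg';
// Use the file's own name as the S3 key so the object name matches the local file.
$keyname = basename($filepath);
echo $keyname, PHP_EOL; // image.jpg
```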
$s3 = S3Client::factory('config.php');
should be
$s3 = S3Client::factory(include 'config.php');
For those looking for an up-to-date working version, this is what I am using:
// Instantiate the client.
$s3 = S3Client::factory(array(
    'credentials' => [
        'key'    => $s3Key,
        'secret' => $s3Secret,
    ],
    'region'  => 'us-west-2',
    'version' => "2006-03-01"
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'     => $s3Bucket,
        'Key'        => $fileId,
        'SourceFile' => $filepath . "/" . $fileName
    ));
    return $result['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
An alternative way to explain it is by showing the curl command and how to build it in PHP: the pragmatic approach.
Please don't stone me for the ugly code; I just thought this example is easy to follow for uploading to Azure from PHP or another language.
$azure1 = 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/';
$azure3 = '?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D';

// Get the file size by parsing `ls -la` output.
$shellCmd = 'ls -la ' . $outFileName;
$lsOutput = shell_exec($shellCmd);
#print_r($lsOutput);
$exploded = explode(' ', $lsOutput);
#print_r($exploded);
$fileLength = $exploded[7];

// Build and run the curl PUT request.
$curlAzure1 = "curl -v -X PUT -T '" . $outFileName . "' -H 'Content-Length: " . $fileLength . "' ";
$buildedCurlForUploading = $curlAzure1 . "'" . $azure1 . $outFileName . $azure3 . "'";
var_dump($buildedCurlForUploading);
shell_exec($buildedCurlForUploading);
This is the actual curl command:
shell_exec("curl -v -X PUT -T 'fileName' -H 'Content-Length: fileSize' 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/fileName?sr=c&si=bucketNameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'")
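As a side note, parsing `ls -la` output is fragile (the column index shifts between platforms and locales); PHP's built-in filesize() returns the byte count directly. A small self-contained sketch using a throwaway file:

```php
<?php
// Create a demo file so the snippet is self-contained.
$outFileName = sys_get_temp_dir() . '/azure-upload-demo.txt';
file_put_contents($outFileName, "hello azure\n");

// filesize() replaces the `ls -la` parsing above.
$fileLength = filesize($outFileName);
echo $fileLength, PHP_EOL; // 12

unlink($outFileName);
```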
Below is the code to upload an image/file to an Amazon S3 bucket.
function upload_agreement_data($target_path, $source_path, $file_name, $content_type)
{
    $fileup_flag = false;

    /*------------- call global settings helper function starts ----------------*/
    $bucketName = "pilateslogic";
    //$global_setting_option = '__cloud_front_bucket__';
    //$bucketName = get_global_settings($global_setting_option);
    /*------------- call global settings helper function ends ----------------*/

    if (!$bucketName) {
        die("ERROR: Template bucket name not found!");
    }

    // Amazon profile_template template js upload URL
    $target_profile_template_js_url = "/" . $bucketName . "/" . $target_path;
    // Caching profile_template template js upload URL
    //$source_profile_template_js_url = dirname(dirname(dirname(__FILE__))) . $source_path . "/" . $file_name;
    // file name
    $template_js_file = $file_name;

    $this->s3->setEndpoint("s3-ap-southeast-2.amazonaws.com");
    if ($this->s3->putObjectFile($source_path, $target_profile_template_js_url, $template_js_file, S3::ACL_PUBLIC_READ, array(), array("Content-Type" => $content_type))) {
        $fileup_flag = true;
    }
    return $fileup_flag;
}