I can't work out how to make this upload as 'reduced redundancy'.
I've added it in there twice, but it doesn't do anything; I think the way I have applied it has no effect.
I think I need to use this line, but it seems I need to rebuild this?
setOption('StorageClass', 'REDUCED_REDUNDANCY')
require_once __DIR__ . '/vendor/autoload.php';

$options = [
    'region' => $region,
    'credentials' => [
        'key' => $accessKeyId,
        'secret' => $secretKey
    ],
    'version' => '2006-03-01',
    'signature_version' => 'v4',
    'StorageClass' => 'REDUCED_REDUNDANCY',
];

$s3Client = new \Aws\S3\S3Client($options);

$uploader = new \Aws\S3\MultipartUploader($s3Client, $filename_dir, [
    'bucket' => $bucket,
    'key' => $filename,
    'StorageClass' => 'REDUCED_REDUNDANCY',
]);

try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}\n";
} catch (\Aws\Exception\MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
Reduced Redundancy Storage used to be about 20% lower cost, in exchange for only storing 2 copies of the data instead of 3 copies (1 redundant copy instead of 2 redundant copies).
However, with the December 2016 pricing changes to Amazon S3, it is no longer beneficial to use Reduced Redundancy Storage.
Using pricing from US Regions:
Reduced Redundancy Storage = 2.4c/GB
Standard storage = 2.3c/GB
Standard-Infrequent Access storage = 1.25c/GB + 1c/GB retrievals
Therefore, RRS is now more expensive than Standard storage. It is now cheaper to choose Standard or Standard-Infrequent Access.
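To make the break-even concrete, here is a small arithmetic sketch using the per-GB prices above, for a hypothetical 100 GB of data (with the full 100 GB retrieved once in the Infrequent Access case):

```php
<?php
// Monthly cost comparison at the cited US-region prices (assumption: 100 GB
// stored, with the full 100 GB retrieved once for the Infrequent Access case).
$gb = 100;

$rrs      = $gb * 0.024;               // Reduced Redundancy: 2.4c/GB
$standard = $gb * 0.023;               // Standard: 2.3c/GB
$ia       = $gb * 0.0125 + $gb * 0.01; // Standard-IA: 1.25c/GB + 1c/GB retrieval

printf("RRS: $%.2f, Standard: $%.2f, Standard-IA: $%.2f\n", $rrs, $standard, $ia);
// RRS ($2.40) costs more than Standard ($2.30); IA wins when retrievals are rare.
```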
Setting "StorageClass" like this won't work.
$s3Client = new \Aws\S3\S3Client($options);
Because the StorageClass is only set when the object is uploaded, you cannot default all of your requests to a specific configuration when initializing the SDK. Each individual PUT request must have its own options specified for it.
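For a plain (non-multipart) upload with SDK v3, the option rides along on the request itself. A minimal sketch, assuming a v3 S3Client named $s3Client already exists; the bucket, key, and path are placeholders:

```php
<?php
// Per-request options: StorageClass belongs in the PutObject parameters,
// not in the client constructor (all values below are placeholders).
$putParams = [
    'Bucket'       => 'my-bucket',
    'Key'          => 'my-file.txt',
    'SourceFile'   => '/path/to/my-file.txt',
    'StorageClass' => 'REDUCED_REDUNDANCY',
];
// $result = $s3Client->putObject($putParams); // requires aws/aws-sdk-php
```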
To use the setOption() line you mentioned, you may need to update your code to follow the example found in the AWS PHP SDK documentation.
Using the AWS PHP SDK for Multipart Upload (High-Level API) Documentation
The following PHP code sample demonstrates how to upload a file using the high-level UploadBuilder object.
<?php

// Include the AWS SDK using the Composer autoloader.
require 'vendor/autoload.php';

use Aws\Common\Exception\MultipartUploadException;
use Aws\S3\Model\MultipartUpload\UploadBuilder;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Prepare the upload parameters.
$uploader = UploadBuilder::newInstance()
    ->setClient($s3)
    ->setSource('/path/to/large/file.mov')
    ->setBucket($bucket)
    ->setKey($keyname)
    ->setMinPartSize(25 * 1024 * 1024)
    ->setOption('Metadata', array(
        'param1' => 'value1',
        'param2' => 'value2'
    ))
    ->setOption('ACL', 'public-read')
    ->setConcurrency(3)
    ->build();

// Perform the upload. Abort the upload if something goes wrong.
try {
    $uploader->upload();
    echo "Upload complete.\n";
} catch (MultipartUploadException $e) {
    $uploader->abort();
    echo "Upload failed.\n";
    echo $e->getMessage() . "\n";
}
So in this case you need to add 'StorageClass' as follows; the position isn't important, only the use of setOption() to set it:
    ->setOption('ACL', 'public-read')
    ->setOption('StorageClass', 'REDUCED_REDUNDANCY')
    ->setConcurrency(3)
    ->build();
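Note that the UploadBuilder / setOption() style above is the version 2 SDK, while the code in the question uses the version 3 Aws\S3\MultipartUploader. In v3, to my understanding, per-upload parameters are applied through callbacks such as before_initiate, which receives the CreateMultipartUpload command. A sketch, with the actual uploader construction left commented out since it needs a live client:

```php
<?php
// v3-style sketch: set StorageClass on the CreateMultipartUpload command.
// Real command objects implement ArrayAccess; an ArrayObject stands in here
// so the callback's effect can be observed without a live S3 client.
$beforeInitiate = function ($command) {
    $command['StorageClass'] = 'REDUCED_REDUNDANCY';
};

// Simulate what the SDK would do with the callback:
$command = new ArrayObject([]);
$beforeInitiate($command);

// $uploader = new \Aws\S3\MultipartUploader($s3Client, $filename_dir, [
//     'bucket'          => $bucket,
//     'key'             => $filename,
//     'before_initiate' => $beforeInitiate,
// ]);
```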
Related
I tried to download a file from Amazon S3 to local storage. The download succeeds and the file appears in local storage, but the file is empty, with no content. It looks like I missed something in the code. I need your help, friends. Thanks in advance.
Here's the code:
<?php

namespace App\Console\Library;

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
use Storage;

class DownloadAWS
{
    public function downloadFile()
    {
        $s3_file = Storage::cloud()->url('Something.jsonl');
        $s3 = Storage::disk('local')->put('Order.jsonl', $s3_file);
    }
}
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1'
]);

try {
    // Get the object.
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key' => $keyname
    ]);

    // Display the object in the browser.
    header("Content-Type: {$result['ContentType']}");
    echo $result['Body'];
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
Currently, you are retrieving a URL to the S3 file and putting it in a file. Your code should currently create a file Order.jsonl containing the link to the S3 file.
What you really seem to want is to get the file and store it locally. You can achieve this with the following code:
public function downloadFile()
{
    $s3_file = Storage::cloud()->get('Something.jsonl');
    $s3 = Storage::disk('local')->put('Order.jsonl', $s3_file);
}
The only difference is using get() vs. url().
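The difference is easy to reproduce without S3 at all: url() hands back a link string, get() hands back the file body, and put() writes whatever string it is given. A plain-PHP illustration (the URL and contents below are made up):

```php
<?php
// put() writes whatever string it receives: a URL string produces a "file"
// containing only the link; the object's bytes produce the real file.
$url      = 'https://bucket.s3.amazonaws.com/Something.jsonl'; // what url() returns
$contents = '{"order": 1}';                                    // what get() returns

$dir = sys_get_temp_dir();
file_put_contents($dir . '/wrong.jsonl', $url);      // stores only the link
file_put_contents($dir . '/right.jsonl', $contents); // stores the actual body
```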
I have a web application in PHP up and running. I want this app to be capable of uploading images to an AWS S3 bucket. I checked the documentation at AWS, but found at least three different sets of documentation for this purpose. I am still not clear: is it possible that my web app, hosted with a different hosting service, will be able to upload files to AWS?
If yes, which is the best option?
You should be able to upload from outside of the AWS network.
Use the AWS PHP SDK at https://aws.amazon.com/sdk-for-php/
Then use the following code:
<?php

require 'vendor/autoload.php';

use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1'
]);

// Prepare the upload parameters.
$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket' => $bucket,
    'key' => $keyname
]);

// Perform the upload.
try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
Edit the bucket name, keyname, region and upload file name.
This is the multi-part upload style so you can upload huge files.
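As a rough sketch of why multipart helps with huge files: the upload is split into parts (the SDK's default part size is 5 MB), and parts can be sent concurrently and retried individually. The part count for a given file is simple arithmetic:

```php
<?php
// Number of parts for a multipart upload (5 MB is the SDK's default part size;
// the 2 GB file size is a hypothetical example).
$partSize = 5 * 1024 * 1024;        // 5 MB in bytes
$fileSize = 2 * 1024 * 1024 * 1024; // 2 GB in bytes

$parts = (int) ceil($fileSize / $partSize);
echo "Parts needed: {$parts}\n"; // 2048 MB / 5 MB = 410 parts (last one partial)
```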
I'm trying desperately to figure out how to create a simple audio transcription script (for longer audio files) in PHP (the only language I know). I'm getting the error Class 'Google\Cloud\Storage\StorageClient' not found.
I'm using the gcloud console code editor and everything should be installed (unless there is a separate Composer install just for Cloud Storage, although I haven't been able to find anything about that in the documentation).
I also ran gcloud auth application-default print-access-token, which printed out an access token, but I don't know what (if anything) I'm supposed to do with it other than the "set GOOGLE_APPLICATION_CREDENTIALS" command that I copied and pasted into the console shell prompt.
Here's the PHP code:
<?php

namespace Google\Cloud\Samples\Speech;

require __DIR__ . '/vendor/autoload.php';

use Exception;
# [START speech_transcribe_async_gcs]
use Google\Cloud\Speech\SpeechClient;
use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Core\ExponentialBackoff;

$projectId = 'xxxx';

$speech = new SpeechClient([
    'projectId' => $projectId,
    'languageCode' => 'en-US',
]);

$filename = "20180925_184741_L.mp3";

# The audio file's encoding and sample rate
$options = [
    'encoding' => 'LINEAR16',
    'sampleRateHertz' => 16000,
    'languageCode' => 'en-US',
    'enableWordTimeOffsets' => false,
    'enableAutomaticPunctuation' => true,
    'model' => 'video',
];

function transcribe_async_gcs($bucketName, $objectName, $languageCode = 'en-US', $options = [])
{
    // Create the speech client
    $speech = new SpeechClient([
        'languageCode' => $languageCode,
    ]);

    // Fetch the storage object
    $storage = new StorageClient();
    $object = $storage->bucket($bucketName)->object($objectName);

    // Create the asynchronous recognize operation
    $operation = $speech->beginRecognizeOperation(
        $object,
        $options
    );

    // Wait for the operation to complete
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($operation) {
        print('Waiting for operation to complete' . PHP_EOL);
        $operation->reload();
        if (!$operation->isComplete()) {
            throw new Exception('Job has not yet completed', 500);
        }
    });

    // Print the results
    if ($operation->isComplete()) {
        $results = $operation->results();
        foreach ($results as $result) {
            $alternative = $result->alternatives()[0];
            printf('Transcript: %s' . PHP_EOL, $alternative['transcript']);
            printf('Confidence: %s' . PHP_EOL, $alternative['confidence']);
        }
    }
}
# [END speech_transcribe_async_gcs]

transcribe_async_gcs("session_audio", $filename, "en-US", $options);
With apologies, PHP is not a language I'm proficient with, but I suspect you haven't installed (and must install) the client library for Cloud Storage so that your code can access it. This would explain the report that the class is missing.
The PHP client library page includes two alternatives. One applies if you're using Composer; the second, possibly what you want, is a direct download that you'll need to path correctly for your code.
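If you are using Composer, the Cloud Storage library is, as far as I can tell, a separate package from the Speech one, so installing it would look something like this (run in the project root):

```shell
# Assumption: Composer-managed project; google/cloud-storage is its own package.
composer require google/cloud-storage
```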
Some time ago, I wrote a short blog post providing a simple example (using Cloud Storage) for each of Google's supported languages. Perhaps it will help you too.
Can I upload an mPDF file to an S3 server after generating it?
$file_name = $pdf->Output(time().'_'.'E-Prescription.pdf', 'F');
Assuming you have the AWS SDK installed in your project using composer; specifically...
composer require aws/aws-sdk-php
Yes you can, using the stream wrapper like this:
require "vendor/autoload.php";

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Instantiate the client first; the original snippet used $s3 without
// creating it (region and credentials here are placeholders).
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1',
]);

$aws_file = 's3://bucketname/foldername/your_file_name.pdf';
// the folder is optional if you have one within your bucket

try {
    $s3->registerStreamWrapper();
    $mpdf->Output($aws_file, \Mpdf\Output\Destination::FILE);
} catch (S3Exception $e) {
    $data['error'] = $e->getMessage();
    // show the error as a JSON callback that you can use for troubleshooting
    echo json_encode($data);
    exit();
}
You might have to add write permissions to your web server as follows (using Apache server on Ubuntu AWS EC2):
sudo chown -R www-data /var/www/html/vendor/mpdf/mpdf/src/Config/tmp
sudo chmod -R 755 /var/www/html/vendor/mpdf/mpdf/src/Config/tmp
Then edit the ConfigVariables.php file found at:
\vendor\mpdf\mpdf\src\Config
Change:
'tempDir' => __DIR__ . '/../../tmp',
To:
'tempDir' => __DIR__ . '/tmp',
Then create an empty folder named 'tmp' in that same directory. Then upload with joy.
// Set your configs
define("AWS_S3_KEY", "<your_key_here>");
define("AWS_S3_SECRET", "<your_secret_here>");
define("AWS_S3_REGION", "<your_region_here example:us-east-1>");
define("AWS_S3_BUCKET", "<your_bucket_folder_name_here>");

try {
    /*
    doc: https://github.com/mpdf/mpdf
    url/download: https://github.com/mpdf/mpdf/archive/development.zip
    */
    require_once 'mpdf/mpdf.php'; // load your mPDF lib
    $mpdf = new mPDF(); // init the mPDF object
    $nomeArquivo = md5('cliente_01'); // set the file name and hash it
    $mpdf->WriteHTML("Teste upload PDF in s3 bucket");

    /*
    doc: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/getting-started_installation.html
    url/download: https://docs.aws.amazon.com/aws-sdk-php/v3/download/aws.zip
    */
    require_once 'aws/aws-autoloader.php'; // load your AWS lib

    $aws_file = 's3://'.AWS_S3_BUCKET.'/'.$nomeArquivo.'.pdf';

    $s3 = new Aws\S3\S3Client([
        'region' => AWS_S3_REGION,
        'version' => 'latest',
        'credentials' => [
            'key' => AWS_S3_KEY,
            'secret' => AWS_S3_SECRET,
        ]
    ]);
    $s3->registerStreamWrapper();

    $mpdf->Output($aws_file); // send your mPDF file to the S3 bucket
} catch (\Aws\S3\Exception\S3Exception $e) {
    die($e->getAwsErrorCode().' => '.$e->getMessage());
}
To do this you could use the AWS SDK for PHP.
First you will need to create a client using your profile credentials.
use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'credentials' => array(
        'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
        'secret' => 'YOUR_AWS_SECRET_ACCESS_KEY',
    )
));
And, if the bucket already exists, you can upload your file from the file system like this:
$result = $client->putObject(array(
    'Bucket'     => $bucket,
    'Key'        => $file_name,
    'SourceFile' => $pathToFile
));
I have been tasked with connecting to an S3 bucket and, using documentation, have the following:
<?php

define('AWS_KEY', 'key in here');
define('AWS_SECRET_KEY', 'key in here');
define('HOST', 'https://console.aws.amazon.com/s3/home?region=us-east-1#');

use Aws\S3\S3Client;

// Establish connection with DreamObjects with an S3 client.
$client = S3Client::factory(array(
    'base_url' => HOST,
    'key' => AWS_KEY,
    'secret' => AWS_SECRET_KEY
));

// list owned buckets
$blist = $client->listBuckets();
echo " Buckets belonging to " . $blist['Owner']['ID'] . ":\n";
foreach ($blist['Buckets'] as $b) {
    echo "{$b['Name']}\t{$b['CreationDate']}\n";
}

// list Bucket contents
$o_iter = $client->getIterator('ListObjects', array(
    'Bucket' => $bucketname
));
foreach ($o_iter as $o) {
    echo "{$o['Key']}\t{$o['Size']}\t{$o['LastModified']}\n";
}
but I get the error in the title. Any ideas? I have my access keys, but I am confused about how to fix this issue.
It is probably a bad idea to hardcode your secret key and access key, or to pass them via environment variables.
A better design pattern would be to leverage an EC2 role or to use the SDK configuration file (see http://docs.aws.amazon.com/aws-sdk-php/guide/latest/credentials.html for details).
The base_url argument you're using is invalid: it is the URL of the console, not of the service. You can just omit this parameter (as per http://docs.aws.amazon.com/aws-sdk-php/guide/latest/configuration.html#client-configuration-options); the SDK will build it automatically for you.
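Putting both points together, a corrected version of the question's client setup might look like this (kept in the v2 factory style to match the question; the credential strings are placeholders, and a config file or EC2 role is still preferable to hardcoding):

```php
<?php
// Corrected config sketch: no base_url at all; the SDK derives the S3
// endpoint from the region. Credential strings are placeholders.
$config = array(
    'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
    'secret' => 'YOUR_AWS_SECRET_ACCESS_KEY',
    'region' => 'us-east-1',
);
// $client = \Aws\S3\S3Client::factory($config); // requires aws/aws-sdk-php v2
```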