Download file from Amazon S3 - download succeeds but file is empty - php

I tried to download a file from Amazon S3 to local storage. The download succeeds and the file appears in local storage, but the file is empty - no content. It looks like I missed something in the code. I need your help, friends. Thanks in advance.
Here's the code:
<?php

namespace App\Console\Library;

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
use Storage;

class DownloadAWS
{
    public function downloadFile()
    {
        $s3_file = Storage::cloud()->url('Something.jsonl');
        $s3 = Storage::disk('local')->put('Order.jsonl', $s3_file);
    }
}

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

try {
    // Get the object.
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key'    => $keyname
    ]);

    // Display the object in the browser.
    header("Content-Type: {$result['ContentType']}");
    echo $result['Body'];
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}

Currently, you are retrieving a URL to the S3 file and writing that URL into a file. Your code therefore creates a file Order.jsonl containing only the link to the S3 file.
What you really seem to want is to fetch the file's contents and store them locally. You can achieve this with the following code:
public function downloadFile()
{
    $s3_file = Storage::cloud()->get('Something.jsonl');
    $s3 = Storage::disk('local')->put('Order.jsonl', $s3_file);
}
The only difference is using get() instead of url(): get() returns the file's contents, while url() returns only its address.
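For large files, get() loads the entire object into memory before writing it out. A minimal sketch that streams the object instead, assuming the same Laravel disks ('cloud' and 'local') and the same file names as above:

```php
<?php
// Sketch: stream an S3 object to local storage instead of buffering
// the whole file in memory. Disk names and file names mirror the
// example above and are assumptions, not a fixed API contract.

use Illuminate\Support\Facades\Storage;

function downloadFileStreamed()
{
    // readStream() returns a PHP resource; writeStream() consumes it
    // chunk by chunk, so memory use stays flat even for large objects.
    $stream = Storage::cloud()->readStream('Something.jsonl');
    Storage::disk('local')->writeStream('Order.jsonl', $stream);

    if (is_resource($stream)) {
        fclose($stream);
    }
}
```

Both readStream() and writeStream() are part of Laravel's Storage/Flysystem API, so this works with any configured disk, not just S3.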


Class S3 not found if used inside function

I am new to PHP. I am trying to write an S3 upload function so I can use it in my other PHP files. I have made an s3.php file like this:
<?php

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\ObjectUploader;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

function bucket_upload($BUCKET_ACCESS_KEY, $BUCKET_SECRET_KEY, $BUCKET_REGION, $BUCKET_URL, $BUCKET_NAME, $FILE, $FILE_NAME) {
    require __DIR__.'/vendor/autoload.php';
    $config = [
        'credentials' => new \Aws\Credentials\Credentials(
            $BUCKET_ACCESS_KEY,
            $BUCKET_SECRET_KEY
        ),
        'version'  => 'latest',
        'region'   => $BUCKET_REGION,
        'endpoint' => $BUCKET_URL
    ];
    $s3Client = new \Aws\S3\S3Client($config);
    if ($s3Client->putObjectFile($FILE, $BUCKET_NAME, $FILE_NAME, S3::ACL_PUBLIC_READ)) {
        return true;
    } else {
        return false;
    }
}
It's working fine if I do not use S3 inside a function. But inside the function it gives me an error on line 23: class S3 not found. Let me know if anyone here can help me fix the issue.
Thanks!
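For context, putObjectFile() and the S3::ACL_PUBLIC_READ constant belong to the old standalone S3.php class, not to the official AWS SDK that the `use` statements above import, which is the likely source of the "class S3 not found" error. A hedged sketch of the same upload using the SDK v3 client's putObject() (the function name and parameter names here are illustrative, mirroring the question):

```php
<?php
// Sketch: the question's upload expressed with the official AWS SDK v3,
// which the `use` statements above actually import. bucket_upload_v3
// is a hypothetical name; parameters mirror the question's.

require __DIR__.'/vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

function bucket_upload_v3($key, $secret, $region, $endpoint, $bucket, $file, $fileName)
{
    $s3Client = new S3Client([
        'credentials' => ['key' => $key, 'secret' => $secret],
        'version'     => 'latest',
        'region'      => $region,
        'endpoint'    => $endpoint,
    ]);

    try {
        // SourceFile uploads straight from a path; 'ACL' => 'public-read'
        // replaces the old S3::ACL_PUBLIC_READ constant.
        $s3Client->putObject([
            'Bucket'     => $bucket,
            'Key'        => $fileName,
            'SourceFile' => $file,
            'ACL'        => 'public-read',
        ]);
        return true;
    } catch (S3Exception $e) {
        return false;
    }
}
```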

Upload image to AWS bucket from remote server

I have a web application in PHP up and running. I want this app to be capable of uploading images to an AWS S3 bucket. I am checking the documentation at AWS, but found at least three different guides for this purpose. I am still not clear: is it possible for my web app, hosted with a different hosting service, to upload files to AWS?
If yes, which is the best option?
You should be able to upload from outside of the AWS network.
Use the AWS PHP SDK at https://aws.amazon.com/sdk-for-php/
Then use the following code:
<?php

require 'vendor/autoload.php';

use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

// Prepare the upload parameters.
$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket' => $bucket,
    'key'    => $keyname
]);

// Perform the upload.
try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
Edit the bucket name, key name, region and upload file name.
This uses the multipart upload style, so you can upload huge files.
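For small files, a plain putObject() call is simpler than the multipart uploader. A minimal sketch under the same assumptions (SDK v3 installed via Composer; bucket, key and file path are placeholders):

```php
<?php
// Sketch: single-request upload for small files, using the same
// SDK v3 client setup as above. Bucket, key and path are placeholders.

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

try {
    // SourceFile streams the file from disk in a single PUT request,
    // which is fine for modestly sized objects.
    $result = $s3->putObject([
        'Bucket'     => '*** Your Bucket Name ***',
        'Key'        => '*** Your Object Key ***',
        'SourceFile' => '/path/to/small/file.jpg'
    ]);
    echo "Uploaded to: {$result['ObjectURL']}" . PHP_EOL;
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
```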

How to upload mpdf file after generating to s3 bucket in php

Can I upload an mpdf file to an S3 server after generating it?
$file_name = $pdf->Output(time().'_'.'E-Prescription.pdf','F');
Assuming you have the AWS SDK installed in your project using composer; specifically...
composer require aws/aws-sdk-php
Yes you can, using the stream wrapper like this:
require "vendor/autoload.php";

// Assumes $s3 is an Aws\S3\S3Client instance you have already created,
// and $mpdf is your generated Mpdf document.
$aws_file = 's3://bucketname/foldername/your_file_name.pdf';
// The folder is optional if you have one within your bucket.

try {
    $s3->registerStreamWrapper();
    $mpdf->Output($aws_file, \Mpdf\Output\Destination::FILE);
} catch (S3Exception $e) {
    $data['error'] = $e->getMessage();
    // Show the error as a JSON callback that you can use for troubleshooting.
    echo json_encode($data);
    exit();
}
You might have to add write permissions to your web server as follows (using Apache server on Ubuntu AWS EC2):
sudo chown -R www-data /var/www/html/vendor/mpdf/mpdf/src/Config/tmp
sudo chmod -R 755 /var/www/html/vendor/mpdf/mpdf/src/Config/tmp
Then edit the ConfigVariables.php file found at:
\vendor\mpdf\mpdf\src\Config
Change:
'tempDir' => __DIR__ . '/../../tmp',
To:
'tempDir' => __DIR__ . '/tmp',
Then create an empty folder named 'tmp' in that same directory. Then upload with joy.
// Set your configs.
define("AWS_S3_KEY", "<your_key_here>");
define("AWS_S3_SECRET", "<your_secret_here>");
define("AWS_S3_REGION", "<your_region_here example:us-east-1>");
define("AWS_S3_BUCKET", "<your_bucket_folder_name_here>");

try {
    /*
    doc: https://github.com/mpdf/mpdf
    url/download: https://github.com/mpdf/mpdf/archive/development.zip
    */
    require_once 'mpdf/mpdf.php'; // load your mPDF lib
    $mpdf = new mPDF(); // init the mPDF object
    $nomeArquivo = md5('cliente_01'); // set the file name and hash it
    $mpdf->WriteHTML("Teste upload PDF in s3 bucket");

    /*
    doc: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/getting-started_installation.html
    url/download: https://docs.aws.amazon.com/aws-sdk-php/v3/download/aws.zip
    */
    require_once 'aws/aws-autoloader.php'; // load the AWS lib
    $aws_file = 's3://'.AWS_S3_BUCKET.'/'.$nomeArquivo.'.pdf';
    $s3 = new Aws\S3\S3Client([
        'region'  => AWS_S3_REGION,
        'version' => 'latest',
        'credentials' => [
            'key'    => AWS_S3_KEY,
            'secret' => AWS_S3_SECRET,
        ]
    ]);
    $s3->registerStreamWrapper();
    $mpdf->Output($aws_file); // send your mPDF file to the S3 bucket
} catch (S3Exception $e) {
    die($e->getAwsErrorCode().' => '.$e->getMessage());
}
To do this you could use the AWS SDK for PHP.
First you will need to create a client using your profile credentials.
use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'credentials' => array(
        'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
        'secret' => 'YOUR_AWS_SECRET_ACCESS_KEY',
    )
));
And, if the bucket already exists, you can upload your file from the file system like this:
$result = $client->putObject(array(
    'Bucket'     => $bucket,
    'Key'        => $file_name,
    'SourceFile' => $pathToFile
));

Setting Storage class on Amazon s3 upload (ver 3)

I can't work out how to make this upload use the 'reduced redundancy' storage class.
I've added it in there twice, but it doesn't do anything. I think the way I have applied it is wrong.
I think I need to use this line, but it seems I need to rebuild this?
setOption('StorageClass', 'REDUCED_REDUNDANCY')
require_once __DIR__ .'/vendor/autoload.php';

$options = [
    'region' => $region,
    'credentials' => [
        'key'    => $accessKeyId,
        'secret' => $secretKey
    ],
    'version' => '2006-03-01',
    'signature_version' => 'v4',
    'StorageClass' => 'REDUCED_REDUNDANCY',
];

$s3Client = new \Aws\S3\S3Client($options);

$uploader = new \Aws\S3\MultipartUploader($s3Client, $filename_dir, [
    'bucket' => $bucket,
    'key'    => $filename,
    'StorageClass' => 'REDUCED_REDUNDANCY',
]);

try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}\n";
} catch (\Aws\Exception\MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
Reduced Redundancy Storage used to be about 20% lower cost, in exchange for only storing 2 copies of the data instead of 3 copies (1 redundant copy instead of 2 redundant copies).
However, with the December 2016 pricing changes to Amazon S3, it is no longer beneficial to use Reduced Redundancy Storage.
Using pricing from US Regions:
Reduced Redundancy Storage = 2.4c/GB
Standard storage = 2.3c/GB
Standard-Infrequent Access storage = 1.25c/GB + 1c/GB retrievals
Therefore, RRS is now more expensive than Standard storage. It is now cheaper to choose Standard or Standard-Infrequent Access.
Setting "StorageClass" like this won't work.
$s3Client = new \Aws\S3\S3Client($options);
Because the StorageClass is only set when the object is uploaded, you cannot default all of your requests to a specific configuration during the initialization of the SDK. Each individual PUT request must have its own options specified for it.
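Note that the question's code (\Aws\S3\MultipartUploader, 'version' => '2006-03-01') is SDK v3, where the storage class is likewise a per-request parameter. A minimal hedged sketch using a single putObject() request (bucket, key and path are placeholders):

```php
<?php
// Sketch: in SDK v3 the storage class is a per-request parameter,
// so set it on the upload call itself, not in the client options.
// Bucket, key and file path below are placeholders.

require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;

$s3Client = new S3Client([
    'version' => '2006-03-01',
    'region'  => 'us-east-1',
]);

$result = $s3Client->putObject([
    'Bucket'       => 'your-bucket',
    'Key'          => 'your-key',
    'SourceFile'   => '/path/to/file',
    // Per-request option; putting this in the client constructor
    // (as in the question) is silently ignored.
    'StorageClass' => 'REDUCED_REDUNDANCY',
]);
```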
To use the "setOption" line you mentioned, you may need to update your code to match the following example from the AWS PHP SDK documentation.
Using the AWS PHP SDK for Multipart Upload (High-Level API) Documentation
The following PHP code sample demonstrates how to upload a file using the high-level UploadBuilder object.
<?php

// Include the AWS SDK using the Composer autoloader.
require 'vendor/autoload.php';

use Aws\Common\Exception\MultipartUploadException;
use Aws\S3\Model\MultipartUpload\UploadBuilder;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Prepare the upload parameters.
$uploader = UploadBuilder::newInstance()
    ->setClient($s3)
    ->setSource('/path/to/large/file.mov')
    ->setBucket($bucket)
    ->setKey($keyname)
    ->setMinPartSize(25 * 1024 * 1024)
    ->setOption('Metadata', array(
        'param1' => 'value1',
        'param2' => 'value2'
    ))
    ->setOption('ACL', 'public-read')
    ->setConcurrency(3)
    ->build();

// Perform the upload. Abort the upload if something goes wrong.
try {
    $uploader->upload();
    echo "Upload complete.\n";
} catch (MultipartUploadException $e) {
    $uploader->abort();
    echo "Upload failed.\n";
    echo $e->getMessage() . "\n";
}
So in this case you need to add 'StorageClass' as follows; the position isn't important, only the use of setOption to set it:
    ->setOption('ACL', 'public-read')
    ->setOption('StorageClass', 'REDUCED_REDUNDANCY')
    ->setConcurrency(3)
    ->build();

Delete object or bucket in Amazon S3?

I created a new Amazon bucket called "photos". The bucket URL is something like:
www.amazons3.salcaiser.com/photos
Now I upload subfolders containing files into that bucket, for example:
www.amazons3.salcaiser.com/photos/thumbs/file.jpg
My questions are: is thumbs/ considered a new bucket or an object?
And if I want to delete the entire thumbs/ directory, do I first need to delete all the files inside it, or can I delete everything in one go?
In the case you are describing, "photos" is the bucket. S3 does not have sub-buckets or directories. Directories are simulated by using slashes in the object key. "thumbs/file.jpg" is an object key and "thumbs/" would be considered a key prefix.
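The prefix model can be seen directly when listing objects. A minimal SDK v3 sketch (the bucket and prefix names are the question's; credentials and region are assumed):

```php
<?php
// Sketch: "directories" are just key prefixes. Listing with a Prefix
// shows every object under the simulated thumbs/ folder. The bucket
// name comes from the question; region/credentials are assumptions.

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$result = $s3->listObjectsV2([
    'Bucket' => 'photos',
    'Prefix' => 'thumbs/',
]);

foreach ($result['Contents'] ?? [] as $object) {
    echo $object['Key'] . PHP_EOL; // e.g. thumbs/file.jpg
}
```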
Dagon's examples are good and use the older version 1.x of the AWS SDK for PHP. However, you can do this more easily with the newest 2.4.x version AWS SDK for PHP which includes a helper method for deleting multiple objects.
<?php

// Include the SDK. This line depends on your installation method.
require 'aws.phar';

use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'key'    => 'your-aws-access-key',
    'secret' => 'your-aws-secret-key',
));

// Delete the objects in the "photos" bucket with a prefix of "thumbs/".
$s3->deleteMatchingObjects('photos', 'thumbs/');
// Include the s3.php file first in your code.
if (!class_exists('S3'))
    require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey'))
    define('awsAccessKey', 'awsAccessKey');
if (!defined('awsSecretKey'))
    define('awsSecretKey', 'awsSecretKey');

// Instantiate the class.
$s3 = new S3(awsAccessKey, awsSecretKey);

if ($s3->deleteObject("bucketname", "filename")) {
    echo 'deleted';
} else {
    echo 'no file found';
}
Found some code snippets for 'directory' deletion - I did not write them:
PHP 5.3+:
$s3 = new AmazonS3();

$bucket = 'your-bucket';
$folder = 'folder/sub-folder/';

$s3->get_object_list($bucket, array(
    'prefix' => $folder
))->each(function($node, $i, $s3) use ($bucket) { // $bucket must be captured
    $s3->batch()->delete_object($bucket, $node);
}, array($s3));

$responses = $s3->batch()->send();
var_dump($responses->areOK());
Older PHP 5.2.x:
$s3 = new AmazonS3();

$bucket = 'your-bucket';
$folder = 'folder/sub-folder/';

$s3->get_object_list($bucket, array(
    'prefix' => $folder
))->each('construct_batch_delete', array($s3));

function construct_batch_delete($node, $i, &$s3)
{
    global $bucket; // $bucket is not otherwise visible inside the callback
    $s3->batch()->delete_object($bucket, $node);
}

$responses = $s3->batch()->send();
var_dump($responses->areOK());
I have implemented this in Yii as:
$aws = Yii::$app->awssdk->getAwsSdk();
$s3 = $aws->createS3();
$s3->deleteMatchingObjects('Bucket Name','object key');
