How to upload dynamically generated PDF to AWS with Laravel? - php

I am dynamically generating a PDF with DOMPDF in Laravel, and I have to save the document to Amazon S3. I can only save the document locally; the upload to the cloud fails.
$pdf = PDF::loadView('contract.contract-pdf', $data);
$pdf->save('contract.pdf');
$s3 = \Storage::disk('s3');
$s3->put('pdf/contract', new File('contract.pdf'), 'public');
How can I send it to the cloud?

You can do it like this:
Put the PDF output (its content) directly on storage, without saving a local copy first:
$pdf = PDF::loadView('contract.contract-pdf', $data);
\Storage::disk('s3')->put('pdf/contract.pdf', $pdf->output(), 'public');
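After the put, you often also need the object's public URL. In Laravel, `Storage::disk('s3')->url('pdf/contract.pdf')` returns it; the sketch below only shows how the standard virtual-hosted-style S3 URL is composed (the bucket and region names are placeholders, not values from the question):

```php
<?php
// Compose the public virtual-hosted-style URL for an S3 object:
// https://<bucket>.s3.<region>.amazonaws.com/<key>
// Bucket, region and key below are placeholder values.
function s3PublicUrl(string $bucket, string $region, string $key): string
{
    return sprintf('https://%s.s3.%s.amazonaws.com/%s', $bucket, $region, ltrim($key, '/'));
}

echo s3PublicUrl('my-bucket', 'us-east-1', 'pdf/contract.pdf') . PHP_EOL;
// https://my-bucket.s3.us-east-1.amazonaws.com/pdf/contract.pdf
```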

After you have saved the file on your server, call the putObject API on your Aws\S3\S3Client instance, $s3:
$result = $s3->putObject([
    'Bucket'     => '-- bucket name --',
    'Key'        => $key,
    // Use either 'Body' (raw content) or 'SourceFile' (a local path), not both.
    'SourceFile' => '/path/to/the/file.png'
]);
For more: https://medium.com/@kentaguilar/simplest-way-to-upload-a-file-to-aws-s3-via-php-e83a9f54ba77
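Note that putObject takes either 'Body' (content already in memory) or 'SourceFile' (a local path the SDK reads for you), not both. A small sketch of assembling the parameter array either way; the bucket and key names here are placeholders:

```php
<?php
// putObject accepts either 'Body' (raw content in memory) or
// 'SourceFile' (a local path the SDK reads for you) -- not both.
// Bucket and key names are illustrative placeholders.
function putObjectParams(string $bucket, string $key, string $contentOrPath, bool $isFile): array
{
    $params = ['Bucket' => $bucket, 'Key' => $key];
    $params[$isFile ? 'SourceFile' : 'Body'] = $contentOrPath;
    return $params;
}

// From a saved file:
print_r(putObjectParams('my-bucket', 'pdf/contract.pdf', '/tmp/contract.pdf', true));
// From in-memory PDF output (no temp file needed):
print_r(putObjectParams('my-bucket', 'pdf/contract.pdf', '%PDF-1.4 ...', false));
```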

Related

PHP AWS S3 sdk: How to generate pre signed url to upload a file to a folder in the S3 bucket using PHP?

I am trying to create a pre-signed URL to upload a file to a folder in an S3 bucket using PHP. I can generate a URL that uploads to the bucket root, but I can't figure out where to specify the folder name. Below is my code.
$object = 'test_103.jpg';
$bucket = $config['s3_input']['bucket'];
$expiry = new DateTime('+10 minutes');
$command = $s3_input->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key'    => $object
]);
$signedRequest = $s3_input->createPresignedRequest($command, '+10 minutes');
$signedUploadUrl = $signedRequest->getUri();
echo $signedUploadUrl;
In the above code, how do I pass the folder name for which I want to create the pre-signed URL?
S3 does not have a hierarchy/directory structure, so keep that in mind. This becomes important later, when you want to "move" or "rename" a "folder" on S3:
https://serverfault.com/questions/435827/what-is-the-difference-between-buckets-and-folders-in-amazon-s3
Direct answer to your question: add slashes ("/") in your "Key" to achieve the effect you want.
Also, from the documentation:
$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'SOME_BUCKET',
    'Key'    => 'photos/photo01.jpg',
    'Body'   => fopen('/tmp/photo01.jpg', 'r'),
));
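Since a "folder" is just a slash-separated key prefix, placing an object "into" one is plain string work. A minimal sketch (the folder and file names are illustrative); the result is what goes into the 'Key' field of getCommand('PutObject', ...):

```php
<?php
// "Folders" on S3 are just slash-separated key prefixes, so putting an
// object in one means building the key string. Names are illustrative.
function s3Key(string $folder, string $filename): string
{
    return trim($folder, '/') . '/' . ltrim($filename, '/');
}

echo s3Key('uploads/2023/', 'test_103.jpg') . PHP_EOL; // uploads/2023/test_103.jpg
```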

Upload image to AWS bucket from remote server

I have a web application in PHP up and running. I want this app to be able to upload images to an AWS S3 bucket. I am checking the documentation at AWS, but found at least three different guides for this purpose. I am still not clear: will my web app, hosted with a different hosting service, be able to upload files to AWS?
If yes, which is the best option?
You should be able to upload from outside of the AWS network.
Use the AWS PHP SDK at https://aws.amazon.com/sdk-for-php/
Then use the following code:
<?php
require 'vendor/autoload.php';

// Note: in SDK v3 the exception lives in Aws\Exception,
// not the v2-era Aws\Common\Exception namespace.
use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

// Prepare the upload parameters.
$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket' => $bucket,
    'key'    => $keyname
]);

// Perform the upload.
try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
Edit the bucket name, key name, region, and upload file name.
This is the multipart upload style, so you can upload huge files.
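To make "multipart" concrete: the SDK's MultipartUploader splits the file into parts and uploads them separately. The helper below is illustrative arithmetic only, not SDK code; S3 requires each part except the last to be at least 5 MiB, with at most 10,000 parts per upload:

```php
<?php
// Illustration of how a multipart upload splits a file (the SDK's
// MultipartUploader does this internally). S3 requires every part
// except the last to be at least 5 MiB, with at most 10,000 parts.
function partCount(int $fileSizeBytes, int $partSizeBytes = 5 * 1024 * 1024): int
{
    return max(1, (int) ceil($fileSizeBytes / $partSizeBytes));
}

echo partCount(512 * 1024 * 1024) . PHP_EOL; // a 512 MiB file at 5 MiB parts -> 103
```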

Upload to Google Cloud Bucket - Invalid Credentials

I'm trying to upload a file to my Google Cloud bucket, but I'm not sure how to include the credentials. I have a .json file containing the credentials and can create a ServiceBuilder, but I don't know how to then use it to upload the file (I've been following this: https://github.com/GoogleCloudPlatform/google-cloud-php#google-cloud-storage-ga).
The following returns a 401: Invalid Credentials:
<?php
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Core\ServiceBuilder;

// Authenticate using a keyfile path.
$cloud = new ServiceBuilder([
    'keyFilePath' => 'keyfile.json'
]);

$storage = new StorageClient([
    'projectId' => 'storage-123456'
]);
$bucket = $storage->bucket('storage-123456');

// Upload a file to the bucket.
$bucket->upload(
    fopen('data/file.txt', 'r')
);

// Using predefined ACLs to manage object permissions, you may
// upload a file and give read access to anyone with the URL.
$bucket->upload(
    fopen('data/file.txt', 'r'),
    [
        'predefinedAcl' => 'publicRead'
    ]
);

// Download and store an object from the bucket locally.
$object = $bucket->object('file_backup.txt');
$object->downloadToFile('data/file_backup.txt');
?>
Ah, I was being thick.
There's no need for the ServiceBuilder; pass the key file path to the StorageClient directly:
$storage = new StorageClient([
    'keyFilePath' => 'keyfile.json',
    'projectId'   => 'storage-123456'
]);

How to upload mpdf file after generating to s3 bucket in php

Can I upload an mPDF file to an S3 server after generating it?
$file_name = $pdf->Output(time().'_'.'E-Prescription.pdf','F');
Assuming you have the AWS SDK installed in your project using composer; specifically...
composer require aws/aws-sdk-php
Yes you can, using the stream wrapper like this:
require "vendor/autoload.php";

// The folder is optional if you have one within your bucket.
$aws_file = 's3://bucketname/foldername/your_file_name.pdf';

try {
    $s3->registerStreamWrapper();
    $mpdf->Output($aws_file, \Mpdf\Output\Destination::FILE);
} catch (S3Exception $e) {
    // Show the error as a JSON callback that you can use for troubleshooting.
    $data['error'] = $e->getMessage();
    echo json_encode($data);
    exit();
}
You might have to add write permissions to your web server as follows (using Apache server on Ubuntu AWS EC2):
sudo chown -R www-data /var/www/html/vendor/mpdf/mpdf/src/Config/tmp
sudo chmod -R 755 /var/www/html/vendor/mpdf/mpdf/src/Config/tmp
Then edit the ConfigVariables.php file found at:
\vendor\mpdf\mpdf\src\Config
Change:
'tempDir' => __DIR__ . '/../../tmp',
To:
'tempDir' => __DIR__ . '/tmp',
Then create an empty folder named 'tmp' in that same directory. Then upload with joy.
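The stream-wrapper trick above works because PHP lets any class handle a custom URL scheme, which is exactly what `$s3->registerStreamWrapper()` sets up for `s3://`. The toy wrapper below is only a sketch of that mechanism: it keeps written data in memory under a made-up `demo://` scheme, whereas the AWS SDK's real wrapper sends the bytes to S3.

```php
<?php
// Sketch of the idea behind registerStreamWrapper(): a PHP stream
// wrapper makes ordinary file functions (and mPDF's Output()) work on
// custom URLs such as s3://bucket/key. This toy wrapper just keeps
// written data in memory; the AWS SDK's real wrapper uploads it to S3.
class MemoryWrapper
{
    public static $files = [];
    public $context;            // set by PHP's streams layer
    private $path;
    private $buffer = '';

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        $this->path = $path;
        return true;
    }

    public function stream_write($data)
    {
        $this->buffer .= $data;
        return strlen($data);
    }

    public function stream_flush()
    {
        return true;
    }

    public function stream_close()
    {
        self::$files[$this->path] = $this->buffer;
    }
}

stream_wrapper_register('demo', MemoryWrapper::class);

// After registration, file_put_contents() works on demo:// URLs, just
// as it works on s3:// URLs once $s3->registerStreamWrapper() is called.
file_put_contents('demo://bucket/contract.pdf', '%PDF-1.4 dummy content');
echo MemoryWrapper::$files['demo://bucket/contract.pdf'] . PHP_EOL;
```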
// Set your config.
define("AWS_S3_KEY", "<your_key_here>");
define("AWS_S3_SECRET", "<your_secret_here>");
define("AWS_S3_REGION", "<your_region_here example:us-east-1>");
define("AWS_S3_BUCKET", "<your_bucket_folder_name_here>");

try {
    /*
    doc: https://github.com/mpdf/mpdf
    url/download: https://github.com/mpdf/mpdf/archive/development.zip
    */
    require_once 'mpdf/mpdf.php'; // load the mPDF library
    $mpdf = new mPDF(); // initialise the mPDF object
    $nomeArquivo = md5('cliente_01'); // set the file name and hash it
    $mpdf->WriteHTML("Test PDF upload to S3 bucket");

    /*
    doc: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/getting-started_installation.html
    url/download: https://docs.aws.amazon.com/aws-sdk-php/v3/download/aws.zip
    */
    require_once 'aws/aws-autoloader.php'; // load the AWS SDK autoloader

    $aws_file = 's3://'.AWS_S3_BUCKET.'/'.$nomeArquivo.'.pdf';
    $s3 = new Aws\S3\S3Client([
        'region'      => AWS_S3_REGION,
        'version'     => 'latest',
        'credentials' => [
            'key'    => AWS_S3_KEY,
            'secret' => AWS_S3_SECRET,
        ]
    ]);
    $s3->registerStreamWrapper();
    $mpdf->Output($aws_file); // write the mPDF file to the S3 bucket
} catch (Aws\S3\Exception\S3Exception $e) {
    die($e->getAwsErrorCode().' => '.$e->getMessage());
}
To do this you could use the AWS SDK for PHP.
First you will need to create a client using your profile credentials.
use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'credentials' => array(
        'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
        'secret' => 'YOUR_AWS_SECRET_ACCESS_KEY',
    )
));
And, if the bucket already exists, you can upload your file from the file system like this:
$result = $client->putObject(array(
    'Bucket'     => $bucket,
    'Key'        => $file_name,
    'SourceFile' => $pathToFile
));

uploading posted file to amazon s3

I'm trying to upload a file to Amazon S3 via Laravel 4.
After user submit a form, the file will be passed to a function where I need to use Amazon PHP SDK and upload the file to Amazon S3 bucket.
But how do I upload the file straight to Amazon S3, without first saving it on my server?
My current code looks like this:
private function uploadVideo($vid)
{
    $file = $vid;
    $filename = $file->getClientOriginalName();

    if (!class_exists('S3')) require_once('S3.php');
    if (!defined('awsAccessKey')) define('awsAccessKey', '123123123');
    if (!defined('awsSecretKey')) define('awsSecretKey', '123123123');

    $s3 = new S3(awsAccessKey, awsSecretKey);
    $s3->putBucket("mybucket", S3::ACL_PUBLIC_READ);
    $s3->putObject($vid, "mybucket", $filename, S3::ACL_PUBLIC_READ);
}
Grab the official SDK from http://docs.aws.amazon.com/aws-sdk-php/latest/index.html
This example uses http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_upload
require('aws.phar');

use Aws\S3\S3Client;
use Aws\Common\Enum\Region;

// Instantiate the S3 client with your AWS credentials and desired AWS region.
$client = S3Client::factory(array(
    'key'    => 'KEY HERE',
    'secret' => 'SECRET HERE',
    'region' => Region::AP_SOUTHEAST_2 // you will need to change or remove this
));

$result = $client->upload(
    'BUCKET HERE',
    'OBJECT KEY HERE',
    'STRING OF YOUR FILE HERE',
    'public-read' // public access ACL
);
