I'm trying to upload a file to Amazon S3 via Laravel 4.
After the user submits a form, the file is passed to a function where I need to use the Amazon PHP SDK to upload it to an Amazon S3 bucket.
But how do I upload the file straight to Amazon S3 without first saving it on my own server?
My current code looks like this:
private function uploadVideo($vid) {
    $file = $vid;
    $filename = $file->getClientOriginalName();

    if (!class_exists('S3')) require_once('S3.php');
    if (!defined('awsAccessKey')) define('awsAccessKey', '123123123');
    if (!defined('awsSecretKey')) define('awsSecretKey', '123123123');

    $s3 = new S3(awsAccessKey, awsSecretKey);
    $s3->putBucket("mybucket", S3::ACL_PUBLIC_READ);
    $s3->putObject($vid, "mybucket", $filename, S3::ACL_PUBLIC_READ);
}
Grab the official SDK from http://docs.aws.amazon.com/aws-sdk-php/latest/index.html.
This example uses the upload() helper: http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_upload
require('aws.phar');

use Aws\S3\S3Client;
use Aws\Common\Enum\Region;

// Instantiate the S3 client with your AWS credentials and desired AWS region
$client = S3Client::factory(array(
    'key'    => 'KEY HERE',
    'secret' => 'SECRET HERE',
    'region' => Region::AP_SOUTHEAST_2 // you will need to change or remove this
));

$result = $client->upload(
    'BUCKET HERE',
    'OBJECT KEY HERE',
    'STRING OF YOUR FILE HERE',
    'public-read' // public access ACL
);
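To tie this back to the original question: upload() accepts a string or a stream as the body, so you can stream the uploaded file straight from PHP's temporary upload location and never write your own copy to disk. A minimal sketch, assuming the uploadVideo($vid) method from the question (where $vid is the Laravel uploaded file) and placeholder credentials/bucket:

private function uploadVideo($vid)
{
    $client = S3Client::factory(array(
        'key'    => 'KEY HERE',
        'secret' => 'SECRET HERE',
    ));

    // getRealPath() points at PHP's temporary upload file, so nothing
    // extra is saved on the server; fopen() hands S3 a read stream.
    $client->upload(
        'BUCKET HERE',
        $vid->getClientOriginalName(),
        fopen($vid->getRealPath(), 'r'),
        'public-read'
    );
}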
I am dynamically generating a PDF with DOMPDF in Laravel, and I have to save the document to Amazon's cloud service. I am unable to save the document to the cloud; I can only save it locally:
$pdf = PDF::loadView('contract.contract-pdf', $data);
$pdf->save('contract.pdf');
$s3 = \Storage::disk('s3');
$s3->put('pdf/contract', new File('contract.pdf'), 'public');
How do I send it to the cloud?
You can do it like this: put the PDF output (its content) directly on storage:
$pdf = PDF::loadView('contract.contract-pdf', $data);
\Storage::disk('s3')->put('pdf/contract.pdf', $pdf->output(), 'public');
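Here $pdf->output() returns the rendered PDF as a raw string, which Storage::put() accepts directly, so the local $pdf->save('contract.pdf') step from the question can be dropped entirely.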
Alternatively, if you have already saved the file on your server, you can call the putObject API on an instance of Aws\S3\S3Client:
$result = $s3->putObject([
    'Bucket'     => '-- bucket name --',
    'Key'        => $key,
    // Use either 'Body' (a string or stream of data) or 'SourceFile'
    // (a path to a local file), but not both:
    'SourceFile' => '/path/to/the/file.png'
]);
For more: https://medium.com/@kentaguilar/simplest-way-to-upload-a-file-to-aws-s3-via-php-e83a9f54ba77
I have a PHP web application up and running, and I want it to be able to upload images to an AWS S3 bucket. I am checking the documentation at AWS, but found at least three different guides for this purpose, and I am still not clear: will my web app, hosted with a different hosting provider, be able to upload files to AWS?
If yes, which is the best option?
You should be able to upload from outside of the AWS network.
Use the AWS PHP SDK at https://aws.amazon.com/sdk-for-php/
Then use the following code:
<?php
require 'vendor/autoload.php';

use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

// Prepare the upload parameters.
$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket' => $bucket,
    'key'    => $keyname
]);

// Perform the upload.
try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
Edit the bucket name, key name, region, and upload file path to match your setup.
This is the multipart upload style, so you can upload very large files.
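Note that the S3Client above is constructed without explicit credentials, so the SDK falls back to its default provider chain (environment variables, a shared credentials file, or an instance profile). Since your app is hosted outside AWS, you will probably want to pass keys explicitly; a minimal sketch with placeholder values:

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'credentials' => [
        'key'    => 'YOUR-ACCESS-KEY-ID',     // placeholder
        'secret' => 'YOUR-SECRET-ACCESS-KEY', // placeholder
    ],
]);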
I'm trying to upload a file to my Google Cloud bucket, but I'm not sure how to include the credentials. I have a .json file containing the credentials and can create a ServiceBuilder, but I don't know how to then use it to upload the file (I've been following this: https://github.com/GoogleCloudPlatform/google-cloud-php#google-cloud-storage-ga).
The following returns a 401: Invalid Credentials:
<?php
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Core\ServiceBuilder;

// Authenticate using a keyfile path
$cloud = new ServiceBuilder([
    'keyFilePath' => 'keyfile.json'
]);

$storage = new StorageClient([
    'projectId' => 'storage-123456'
]);

$bucket = $storage->bucket('storage-123456');

// Upload a file to the bucket.
$bucket->upload(
    fopen('data/file.txt', 'r')
);

// Using predefined ACLs to manage object permissions, you may
// upload a file and give read access to anyone with the URL.
$bucket->upload(
    fopen('data/file.txt', 'r'),
    [
        'predefinedAcl' => 'publicRead'
    ]
);

// Download and store an object from the bucket locally.
$object = $bucket->object('file_backup.txt');
$object->downloadToFile('data/file_backup.txt');
?>
Ah, I was being thick.
There's no need for the ServiceBuilder; you can pass the keyfile path to the StorageClient directly:
$storage = new StorageClient([
    'keyFilePath' => 'keyfile.json',
    'projectId'   => 'storage-123456'
]);
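For completeness, a minimal sketch of the corrected upload flow, reusing the keyfile, project ID, and bucket name from the question (passing the keyfile to the StorageClient is what fixes the 401):

require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// The credentials now travel with the storage client itself.
$storage = new StorageClient([
    'keyFilePath' => 'keyfile.json',
    'projectId'   => 'storage-123456'
]);

$bucket = $storage->bucket('storage-123456');

// Upload publicly readable, as in the question.
$bucket->upload(
    fopen('data/file.txt', 'r'),
    ['predefinedAcl' => 'publicRead']
);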
Is it possible to convert private S3 files in a bucket to public using the PHP library provided by Amazon AWS S3?
All you need to do is set the ACL to public-read. You can do this with the (legacy 1.x) PHP SDK using the update_object() function:
$s3 = new AmazonS3();
$bucket = 'my-bucket' . strtolower($s3->key);

$response = $s3->update_object($bucket, 'test1.txt', array(
    'acl' => AmazonS3::ACL_PUBLIC
));
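If you are on a newer (2.x/3.x) release of the SDK, the equivalent call is putObjectAcl(); a minimal sketch, assuming a configured Aws\S3\S3Client in $s3 and the same bucket and key as above:

$result = $s3->putObjectAcl([
    'Bucket' => 'my-bucket',
    'Key'    => 'test1.txt',
    'ACL'    => 'public-read', // makes this one object publicly readable
]);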
I created a new Amazon bucket called "photos". The bucket URL is something like:
www.amazons3.salcaiser.com/photos
Now I upload subfolders containing files into that bucket, for example:
www.amazons3.salcaiser.com/photos/thumbs/file.jpg
My questions are: is thumbs/ considered a new bucket, or is it an object?
And if I want to delete the entire thumbs/ directory, do I first need to delete all the files inside it, or can I delete everything in one go?
In the case you are describing, "photos" is the bucket. S3 does not have sub-buckets or directories. Directories are simulated by using slashes in the object key. "thumbs/file.jpg" is an object key and "thumbs/" would be considered a key prefix.
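You can see this in the API itself: listing with a prefix is how tools simulate browsing a folder. A small sketch using the same 2.x SDK as the answer below (client construction assumed):

// Every key that starts with "thumbs/" belongs to the simulated directory.
$result = $s3->listObjects(array(
    'Bucket' => 'photos',
    'Prefix' => 'thumbs/',
));

foreach ($result['Contents'] as $object) {
    echo $object['Key'] . PHP_EOL; // e.g. "thumbs/file.jpg"
}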
Dagon's examples are good and use the older version 1.x of the AWS SDK for PHP. However, you can do this more easily with the newer 2.4.x version of the AWS SDK for PHP, which includes a helper method for deleting multiple objects.
<?php
// Include the SDK. This line depends on your installation method.
require 'aws.phar';

use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'key'    => 'your-aws-access-key',
    'secret' => 'your-aws-secret-key',
));

// Delete the objects in the "photos" bucket with a prefix of "thumbs/"
$s3->deleteMatchingObjects('photos', 'thumbs/');
// Include the s3.php file first in your code
if (!class_exists('S3'))
    require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey'))
    define('awsAccessKey', 'awsAccessKey');
if (!defined('awsSecretKey'))
    define('awsSecretKey', 'awsSecretKey');

// Instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);

if ($s3->deleteObject("bucketname", "filename")) {
    echo 'deleted';
} else {
    echo 'no file found';
}
I found some code snippets for 'directory' deletion (I did not write them).
PHP 5.3+:
$s3 = new AmazonS3();
$bucket = 'your-bucket';
$folder = 'folder/sub-folder/';

$s3->get_object_list($bucket, array(
    'prefix' => $folder
))->each(function($node, $i, $s3) use ($bucket) { // capture $bucket for the closure
    $s3->batch()->delete_object($bucket, $node);
}, array($s3));

$responses = $s3->batch()->send();
var_dump($responses->areOK());
Older PHP 5.2.x:
$s3 = new AmazonS3();
$bucket = 'your-bucket';
$folder = 'folder/sub-folder/';

$s3->get_object_list($bucket, array(
    'prefix' => $folder
))->each('construct_batch_delete', array($s3));

function construct_batch_delete($node, $i, &$s3)
{
    global $bucket; // $bucket is not otherwise visible inside this function
    $s3->batch()->delete_object($bucket, $node);
}

$responses = $s3->batch()->send();
var_dump($responses->areOK());
I have implemented this in Yii as follows:
$aws = Yii::$app->awssdk->getAwsSdk();
$s3 = $aws->createS3();
$s3->deleteMatchingObjects('Bucket Name', 'object key'); // the second argument is treated as a key prefix