Using PHP, I want to upload some files to a public Amazon S3 bucket, but I don't want to put the access key and secret key in my code. While searching I found suggestions that passing the keyword "null" for the keys would solve the case. I tried PHP's equivalents (the string 'null' and an empty string ''), but had no luck. Below is my code:
<?php
if (!class_exists('S3')) require_once('S3.php');
//AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'null');
if (!defined('awsSecretKey')) define('awsSecretKey', 'null');
//instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket("bucket name", S3::ACL_PUBLIC_READ_WRITE);
//move the file
if ($s3->putObjectFile("your file name in the server with path", "bucket name", "file name in s3 server", S3::ACL_PUBLIC_READ_WRITE)){
//s3 upload success
}
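For what it's worth, the usual way to keep keys out of code is not a literal "null" but letting an SDK resolve credentials on its own (environment variables, a shared credentials file, or an EC2 instance role). Below is a minimal sketch using the official AWS SDK for PHP instead of the standalone s3.php class, assuming credentials are available from the environment; the bucket and file names are placeholders:
require 'vendor/autoload.php';
use Aws\S3\S3Client;
// No keys in code: the SDK's default provider chain looks them up in
// environment variables, ~/.aws/credentials, or the instance profile.
$s3 = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);
$s3->putObject([
    'Bucket'     => 'bucket name',
    'Key'        => 'file name in s3 server',
    'SourceFile' => 'your file name in the server with path',
    'ACL'        => 'public-read-write',
]);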
I am using laravel-google-cloud-storage to store images and retrieve them one by one. Is it possible to get all the folders and images from Google Cloud Storage? If so, how do I do it?
I tried using flysystem-google-cloud-storage to retrieve them, but its examples are similar to the first link I have provided.
What I want to achieve is to select an image from Google Cloud Storage, browsing all the folders and images in it, and put it in my form, instead of selecting an image from my local machine.
UPDATE:
This is what I have tried so far, based on this documentation.
use Google\Cloud\Storage\StorageClient;

$storageClient = new StorageClient([
    'projectId'   => 'project-id',
    'keyFilePath' => 'myKeyFile.json',
]);
$bucket = $storageClient->bucket('my-bucket');
$buckets = $storageClient->buckets();
Then I tried adding a foreach, but it returns nothing, even though I have 6 folders in my bucket.
foreach ($buckets as $bucket) {
    dd($bucket->name());
}
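Note that buckets() lists the buckets in a project, not the folders inside a bucket; in Cloud Storage, "folders" are just object-name prefixes. A minimal sketch of listing objects under a prefix (the prefix name here is a placeholder):
// "Folders" are object-name prefixes, so list objects with a
// prefix instead of listing buckets.
$bucket = $storageClient->bucket('my-bucket');
foreach ($bucket->objects(['prefix' => 'my-folder/']) as $object) {
    echo $object->name() . PHP_EOL;
}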
It's been a week and my post has not been answered, so I'll just post and share what I did since last week for anyone interested.
I am using Laravel 5.4 at the moment.
So I installed laravel-google-cloud-storage and flysystem-google-cloud-storage in my application.
I created a separate controller, since I am retrieving the images from Google Cloud Storage via Ajax.
First you need your Google Cloud Storage credentials, which can be found in the Google Cloud dashboard: look for the APIs card, click the link labelled "Go to APIs overview", then Credentials. Download the credentials file, which is in JSON format, and put it in your project root or wherever you want (I still don't know where this file should properly live). Next, get your Google Cloud project ID, which is also shown on the dashboard.
Then this is the setup in my controller that connects my Laravel application to Google Cloud Storage, with which I am able to upload, retrieve, delete, and copy files.
use Google\Cloud\Storage\StorageClient;
use League\Flysystem\Filesystem;
use League\Flysystem\Plugin\GetWithMetadata;
use Superbalist\Flysystem\GoogleStorage\GoogleStorageAdapter;
class GoogleStorageController extends Controller
{
    // in my method (the method name here is arbitrary)
    public function getImages()
    {
        $storageClient = new StorageClient([
            'projectId'   => 'YOUR-PROJECT-ID',
            'keyFilePath' => '/path/of/your/keyfile.json',
        ]);

        // name of your bucket
        $bucket = $storageClient->bucket('your-bucket-name');
        $adapter = new GoogleStorageAdapter($storageClient, $bucket);
        $filesystem = new Filesystem($adapter);

        // this line here will retrieve all your folders and images
        $contents = $filesystem->listContents();

        // or get a specific directory and the images inside it
        // by passing the directory name as a parameter
        $contents = $filesystem->listContents('directory-name');

        return response()->json([
            'contents' => $contents
        ]);
    }
}
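To separate the folders from the images in that result, note that each entry returned by listContents() in Flysystem 1.x is an array with a 'type' key; a minimal sketch, reusing the $filesystem from above:
// Split the listing into directories and files
// ('type' is 'dir' or 'file' in Flysystem 1.x).
$entries = $filesystem->listContents('', true); // true = recursive
$folders = array_filter($entries, function ($e) { return $e['type'] === 'dir'; });
$images  = array_filter($entries, function ($e) { return $e['type'] === 'file'; });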
I'm trying to write a script to move images from an old server to Amazon S3. Is it possible to do this by downloading each image from a URL?
$companies = Company::all();
$companies->each( function ($company) {
//some method to download file
$file = download($company->logo);
//Store on s3
$filename = $file->storeAs('images', uniqid('img_') . "." . $file->guessClientExtension(),'s3');
//Get the new path
$new_path = Storage::disk('s3')->url($filename);
//save the new path to logo
$company->logo = $new_path;
//save the new path
$company->save();
});
You can use the PHP League Flysystem library.
It has integrations for Laravel and other frameworks such as Zend, CakePHP, etc.
The library has adapters for Amazon S3 v2 and v3.
Full documentation here
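As a minimal sketch of the download-then-store flow with Laravel's Storage facade (which uses Flysystem underneath); the extension handling here is an assumption, adapt it to your data:
use Illuminate\Support\Facades\Storage;

$companies = Company::all();
$companies->each(function ($company) {
    // Download the raw image bytes from the old server.
    $contents = file_get_contents($company->logo);

    // Keep the original extension; fall back to jpg if there is none.
    $extension = pathinfo(parse_url($company->logo, PHP_URL_PATH), PATHINFO_EXTENSION) ?: 'jpg';
    $filename  = 'images/' . uniqid('img_') . '.' . $extension;

    // Write to the s3 disk and save the new URL on the model.
    Storage::disk('s3')->put($filename, $contents);
    $company->logo = Storage::disk('s3')->url($filename);
    $company->save();
});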
I am trying to fetch/retrieve files stored on AWS Glacier using PHP, but I am not able to find any method to do so.
What I want is simply to fetch/retrieve files from AWS Glacier using PHP. If anyone has an idea about it, please suggest it.
Thanks.
As per the example from GitHub, you can retrieve a file using the following:
// Use the us-west-2 region and latest version of each client.
$sharedConfig = [
'region' => 'us-west-2',
'version' => 'latest'
];
// Create an SDK class used to share configuration across clients.
$sdk = new Aws\Sdk($sharedConfig);
// Create an Amazon Glacier client using the shared configuration data.
$client = $sdk->createGlacier();
//Download our archive from Amazon to our server
$result = $client->getJobOutput(array(
'vaultName' => '<YOUR VAULT>', //The name of the vault
'jobId' => 'XXXX' //supply the unique ID of the job that retrieved the archive
));
$data = $result->get('body'); //Sets the file data to a variable
$description = $result->get('archiveDescription'); //Sets file description to a variable
//deletes the temp file on our server if it exists
if(file_exists("files/temp")){
unlink("files/temp");
}
$filepath = "files/temp";
$fp = fopen($filepath, "w"); //creates a new file temp file on our web server
fwrite($fp, $data); //write the data in our variable to our temp file
//Your archive is now ready for download on your web server
You can review the PHP Glacier reference documentation for more details.
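Note that Glacier retrieval is asynchronous: before getJobOutput can return data, you must start an archive-retrieval job and wait (often several hours) for it to complete. A minimal sketch of that first step, assuming you already have the archive ID:
// Start an archive-retrieval job for the archive you want back.
$job = $client->initiateJob(array(
    'vaultName'     => '<YOUR VAULT>',
    'jobParameters' => array(
        'Type'      => 'archive-retrieval',
        'ArchiveId' => '<YOUR ARCHIVE ID>',
    ),
));
$jobId = $job->get('jobId'); // pass this to getJobOutput once the job completes

// Poll the job status (an SNS notification is the nicer option).
$status = $client->describeJob(array(
    'vaultName' => '<YOUR VAULT>',
    'jobId'     => $jobId,
));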
Would uploading to S3 using SQS make the process more fault tolerant?
If so, I am having a hard time with the syntax, trying to combine creating a queue with uploading to S3. If my logic is not correct, how would I set up a system that uses SQS to upload to S3?
if (!class_exists('S3'))require_once('S3.php');
// *these keys are random strings
$AWS_KEY = "6VVWTU4JDAAKHYB1C3ZN";
$AWS_SECRET_KEY = "GMSCUD8C0QA1QLV9Y3RP2IAKDIZSCHRGKEJSXZ4F";
//AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', $AWS_KEY);
if (!defined('awsSecretKey')) define('awsSecretKey', $AWS_SECRET_KEY);
//instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);
//check whether a form was submitted
if(isset($_POST['Submit'])){
//retrieve post variables
$fileName = $_FILES['theFile']['name'];
$fileTempName = $_FILES['theFile']['tmp_name'];
//create a new bucket
$s3->putBucket("mybucket", S3::ACL_PUBLIC_READ);
//add the queue
$sqs = new AmazonSQS(array( "key" => $AWS_KEY, "secret" => $AWS_SECRET_KEY ));
$response = $sqs->create_queue('test-topic-queue');
$queue_url = (string) $response->body->CreateQueueResult->QueueUrl;
$queue_arn = 'arn:aws:sqs:us-east-1:ENCQ8gqrAcXv:test-topic-queue';
//$queue_url . '?Action=SendMessage&MessageBody=Your%20Message%20Text&AWSAccessKeyId=AKIAIOSFODNN7EXAMPLE&Version=2011-10-01&Expires=2008-02-10T12:00:00Z&Signature=lBP67vCvGlDMBQ1dofZxg8E8SUEXAMPLE&SignatureVersion=2&SignatureMethod=HmacSHA256'
// HOW DO I INCORPORATE SQS AND S3
//move the file
if ($s3->putObjectFile($fileTempName,
"mybucket",
"myFolder/" . $fileName, S3::ACL_PUBLIC_READ,
array(),
$_FILES['theFile']['type']) ) {
//it works
}else{
// error
}
}
I'm not exactly sure what fault tolerance you are asking for. But in terms of using S3 and SQS for scaling, there is an excellent paper on the Amazon AWS website that talks about scaling your infrastructure up and down using SQS and EC2 together, which can of course include processes like uploading to S3 and using SQS to tell the application to process something. You don't mention whether you're using EC2 or if this is of interest.
Here is the article: http://aws.amazon.com/articles/1464
Otherwise, it sounds like your logic may be confused: SQS isn't an in-between from your server to S3, but rather is meant for application messaging.
I think I figured out the OP's confusion on this topic.
The diagram shown makes it appear that SQS handles uploads, but it really doesn't: whether you upload 1 or 100 photos, they're added to S3 directly. SQS then carries a "task" message, which one of the EC2 "processing servers" pulls, and that server grabs the actual picture named in the SQS message from S3 storage.
Hopefully this gives some insight to future users who see this question feeling lost.
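To make that flow concrete, here is a minimal sketch using the current AWS SDK for PHP; the bucket, queue URL, and message shape are assumptions:
require 'vendor/autoload.php';
use Aws\S3\S3Client;
use Aws\Sqs\SqsClient;

$config = ['region' => 'us-east-1', 'version' => 'latest'];
$s3  = new S3Client($config);
$sqs = new SqsClient($config);

// 1. Upload the file directly to S3.
$s3->putObject([
    'Bucket'     => 'mybucket',
    'Key'        => 'myFolder/' . $fileName,
    'SourceFile' => $fileTempName,
]);

// 2. Publish a task message so a worker knows which object to process.
$sqs->sendMessage([
    'QueueUrl'    => $queueUrl, // from create_queue above
    'MessageBody' => json_encode(['bucket' => 'mybucket', 'key' => 'myFolder/' . $fileName]),
]);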
I have a website hosted on Amazon. I want my clients to be able to grant access so that files already in their Amazon S3 space can be copied into my S3 space. Is there any PHP API that supports this functionality?
Amazon actually provides one. And there are lots of examples on the web of using it. Google is your friend.
Amazon has a PHP SDK; check the sample code:
// The sample code below demonstrates how Resource APIs work
$aws = new Aws($config);
// Get references to resource objects
$bucket = $aws->s3->bucket('my-bucket');
$object = $bucket->object('image/bird.jpg');
// Access resource attributes
echo $object['LastModified'];
// Call resource methods to take action
$object->delete();
$bucket->delete();
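Since the question is about getting files from a client's bucket into yours, the SDK's copyObject call covers exactly that; a minimal sketch, assuming your credentials have been granted read access to the client's bucket (all names are placeholders):
use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// Server-side copy: the object goes straight from the client's
// bucket to yours without passing through your web server.
$s3->copyObject([
    'Bucket'     => 'my-bucket',                    // destination bucket
    'Key'        => 'uploads/bird.jpg',             // destination key
    'CopySource' => 'client-bucket/image/bird.jpg', // source bucket/key
]);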
Or use the old s3.php for uploading files to an S3 bucket. It's a single PHP file named S3.php.
You just download it and include it in your code. For more, read this.
<?php
if (!class_exists('S3'))require_once('S3.php');
//AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'Your S3 Access Key');
if (!defined('awsSecretKey')) define('awsSecretKey', 'Your Secret Key');
//instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket("bucket name", S3::ACL_PRIVATE);
//move the file
if ($s3->putObjectFile("your file name in the server with path", "bucket name", "file name in s3 server", S3::ACL_PRIVATE)) {
//s3 upload success
}
?>