Laravel 5.1 AWS S3 Storage, how to link images? - php

I am in the process of creating a "Content Management System" for a startup. I have a Post.php model in my project; the following code snippet is taken from the Create method:
if (Request::file('display_image') != null) {
    Storage::disk('s3')->put('/app/images/blog/'.$post->slug.'.jpg', file_get_contents(Request::file('display_image')));

    $bucket = Config::get('filesystems.disks.s3.bucket');
    $s3 = Storage::disk('s3');

    $command = $s3->getDriver()->getAdapter()->getClient()->getCommand('GetObject', [
        'Bucket' => $bucket,
        'Key'    => '/app/images/blog/'.$post->slug.'.jpg',
        'ResponseContentDisposition' => 'attachment;'
    ]);

    $request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
    $image_url = (string) $request->getUri();
    $post->display_image = $image_url;
}
The above code checks whether there is a "display_image" file input in the request object.
If it finds a file, it uploads it directly to AWS S3 storage. I want to save the link to the file in the database so I can use it later in my views.
Hence I use this piece of code:
$request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
$image_url = (string) $request->getUri();
$post->display_image = $image_url;
I get a URL; the only problem is that whenever I visit the $post->display_image URL I get a 403 Permission Denied. Obviously no authentication takes place when using the URL of the image.
How can I solve this? I need to be able to link all my images/files from Amazon S3 to the front-end of the website.

You could open up those S3 URLs to public viewing, but you probably wouldn't want to. You have to pay for the outgoing bandwidth every time someone views one of those images.
You might want to check out Glide, a pretty simple-to-use image library that supports S3. Make sure to reduce the load requirements on your server and wallet by setting caching headers on the images you serve.
Alternatively, you could use a CloudFront distribution as a caching proxy in front of your S3 bucket.
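If you do decide to serve the images publicly, here is a minimal sketch of what that could look like in Laravel (the bucket URL style is an assumption; adjust it for your region, and use setVisibility() if your Laravel version's put() lacks the visibility argument):
// Upload with public visibility so the object no longer returns 403:
$key = 'app/images/blog/'.$post->slug.'.jpg';
Storage::disk('s3')->put($key, file_get_contents(Request::file('display_image')), 'public');

// Store a stable, non-expiring URL instead of a presigned one
// (URL format is illustrative; path-style vs virtual-hosted varies by region):
$bucket = Config::get('filesystems.disks.s3.bucket');
$post->display_image = 'https://'.$bucket.'.s3.amazonaws.com/'.$key;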

Related

Get all Folders and Images from Google Cloud Storage with Laravel

I am using laravel-google-cloud-storage to store images and retrieve them one by one. Is it possible to get all the folders and images from Google Cloud Storage, and if so, how do I do it?
I also tried using flysystem-google-cloud-storage to retrieve them, but the results are similar to the first link I provided.
What I want to achieve is to select an image from Google Cloud Storage, browsing all its folders and images, and put it in my form instead of selecting an image from my local machine.
UPDATE:
This is what I have tried so far, based on this documentation.
$storageClient = new StorageClient([
    'projectId' => 'project-id',
    'keyFilePath' => 'myKeyFile.json',
]);
$bucket = $storageClient->bucket('my-bucket');
$buckets = $storageClient->buckets();
Then I tried adding a foreach, which returns empty, even though I have 6 folders in my bucket.
foreach ($buckets as $bucket) {
    dd($bucket->name());
}
It's been a week and my post has not been answered, so I'll just post and share what I did since last week.
I am using Laravel 5.4 at the moment.
So I installed laravel-google-cloud-storage and flysystem-google-cloud-storage in my application.
I created a separate controller, since I am retrieving the images from Google Cloud Storage via Ajax.
All you need to do is get your Google Cloud Storage credentials, which can be found in your Google Cloud Storage Dashboard: look for the APIs section, then click the link labelled "Go to APIs overview" > Credentials. Download the credentials, which come as a JSON file, and put the file in your project root or anywhere you want (I still don't know where this file should properly live). Next, get your Google Cloud Storage project ID, which is shown on the Dashboard.
Then this is the setup in my controller that connects my Laravel application to Google Cloud Storage, with which I am able to upload, retrieve, delete, and copy files.
use Google\Cloud\Storage\StorageClient;
use League\Flysystem\Filesystem;
use League\Flysystem\Plugin\GetWithMetadata;
use Superbalist\Flysystem\GoogleStorage\GoogleStorageAdapter;

class GoogleStorageController extends Controller
{
    // method name is a placeholder; this is the body of my Ajax endpoint
    public function index()
    {
        $storageClient = new StorageClient([
            'projectId' => 'YOUR-PROJECT-ID',
            'keyFilePath' => '/path/of/your/keyfile.json',
        ]);

        // name of your bucket
        $bucket = $storageClient->bucket('your-bucket-name');

        $adapter = new GoogleStorageAdapter($storageClient, $bucket);
        $filesystem = new Filesystem($adapter);

        // this line will retrieve all your folders and images
        $contents = $filesystem->listContents();

        // you can also get a specific directory and the images inside it
        // by passing the directory name as a parameter
        $contents = $filesystem->listContents('directory-name');

        return response()->json([
            'contents' => $contents
        ]);
    }
}
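If you only need the image files out of that listing, you can filter the Flysystem results. A minimal sketch (the extension list is illustrative, not exhaustive):
// Each listContents() entry is an array with 'type' and 'path' keys.
$images = array_filter($contents, function ($entry) {
    return $entry['type'] === 'file'
        && preg_match('/\.(jpe?g|png|gif)$/i', $entry['path']);
});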

Laravel Download from S3 To Local

I am trying to download a file that I stored on S3 to my local Laravel installation to manipulate it. Would appreciate some help.
I have the config data set up correctly, because I am able to upload without any trouble. I am saving files in S3 with the pattern "user->id / media->id.mp3" (note that I am not just dumping files on S3; I am saving them in directories).
After successfully uploading the file to S3, I update the save path in my DB to "user->id / media->id.mp3", not some long public URL (is that wrong?).
When I later go back to try and download the file, I get a FileNotFoundException from S3. I'm doing this:
$audio = Storage::disk('s3')->get($media->location);
The weird thing is that the exception shows the resource that it cannot fetch, but when I place that same URL in a browser it displays the file without any trouble at all. Why can't the filesystem get the file?
I have tried doing a "has" check before the "get", and the has check comes up false.
Do I need to save the full public URL in the database for this to work? I tried that and it didn't help. I feel like I am missing something very simple, and it is making me crazy!!
Late answer, but important for others:
$s3_file = Storage::disk('s3')->get(request()->file);
$local = Storage::disk('public');
$local->put("file_name.tif", $s3_file);
$s3_file will hold the raw file contents; you can save that data to a file using Laravel's put method, and you will find the file in the storage/app/public directory.
You can set Content-Type as desired, and Content-Disposition to 'attachment', because your files are coming from S3 and you want them downloaded as an attachment:
$event_data = $this->ticket->where('user_id', $user_id)->first();
$data = $event_data->pdf;
$get_ticket = 'tickets/'.$data;
$file_name = "YOUR_DESIRED_NAME.pdf";

$headers = [
    'Content-Type'        => 'application/pdf',
    'Content-Disposition' => 'attachment; filename="'.$file_name.'"',
];

return \Response::make(Storage::disk('s3')->get($get_ticket), 200, $headers);
Say you have AWS S3 as your default storage, and you want to download my_file.txt from S3 to my_laravel_project\storage\app\my_file.txt, as a one-liner:
Storage::disk('local')->put('my_file.txt', Storage::get('my_file.txt'));

S3 Force Download

I have a website here http://www.voclr.it/acapellas/ and my files are hosted on my Amazon S3 account, but when a visitor goes to download an MP3 from my website it forces them to stream it, when what I actually want is for it to download to their desktop.
I have disabled S3 on the website for now, so the downloads are working fine, but really I want S3 to serve the MP3s.
Basically, you have to tell S3 to override the content-disposition header of the response. You can do that by appending the response-content-disposition query string parameter to the S3 file url and setting it to the desired content-disposition value. To force download try:
<url>&response-content-disposition="attachment; filename=somefilename"
You can find this in the S3 docs. For information on the values that the content-disposition header can assume you can look here.
As an additional information this also works with Google Cloud Storage.
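If you are generating those links in PHP, here is a hedged sketch with the AWS SDK for PHP v3 (an assumption; the answer below uses the older 1.x SDK) that bakes the override into a presigned URL. Bucket, key, and region are placeholders:
use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$command = $s3->getCommand('GetObject', [
    'Bucket' => 'your_bucket',
    'Key'    => 'acapellas/track.mp3',
    'ResponseContentDisposition' => 'attachment; filename="track.mp3"',
]);

// The resulting URL forces a download and expires after 10 minutes.
$url = (string) $s3->createPresignedRequest($command, '+10 minutes')->getUri();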
require_once '../sdk-1.4.2.1/sdk.class.php';

// Instantiate the class
$s3 = new AmazonS3();

// Copy the object over itself and modify the headers
$response = $s3->copy_object(
    array( // Source
        'bucket'   => 'your_bucket',
        'filename' => 'Key/To/YourFile'
    ),
    array( // Destination
        'bucket'   => 'your_bucket',
        'filename' => 'Key/To/YourFile'
    ),
    array( // Optional parameters
        'headers' => array(
            'Content-Type'        => 'application/octet-stream',
            'Content-Disposition' => 'attachment'
        )
    )
);

// Success?
var_dump($response->isOK());
Amazon AWS S3 to Force Download Mp3 File instead of Stream It
I created a solution for doing this via CloudFront Functions (no PHP required, since it all runs at AWS: you link to the .mp3 file on CloudFront with a ?title=TitleGoesHere query string to force a downloaded file with that filename). This is a fairly recent way of doing things (as of August 2022). I documented my function and how I set up my "S3 bucket behind a CloudFront Distribution" here: https://stackoverflow.com/a/73451456/19823883

AWS upload to S3 with SQS - PHP syntax

Would uploading to S3 using SQS make the process more fault tolerant?
If so, I am having a hard time with the syntax of combining creating a queue with uploading to S3. If my logic is not correct, how would I set up a system that uses SQS to upload to S3?
if (!class_exists('S3')) require_once('S3.php');

// *these keys are random strings
$AWS_KEY = "6VVWTU4JDAAKHYB1C3ZN";
$AWS_SECRET_KEY = "GMSCUD8C0QA1QLV9Y3RP2IAKDIZSCHRGKEJSXZ4F";

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', $AWS_KEY);
if (!defined('awsSecretKey')) define('awsSecretKey', $AWS_SECRET_KEY);

// instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);

// check whether a form was submitted
if (isset($_POST['Submit'])) {
    // retrieve post variables
    $fileName = $_FILES['theFile']['name'];
    $fileTempName = $_FILES['theFile']['tmp_name'];

    // create a new bucket
    $s3->putBucket("mybucket", S3::ACL_PUBLIC_READ);

    // add the queue
    $sqs = new AmazonSQS(array("key" => $AWS_KEY, "secret" => $AWS_SECRET_KEY));
    $response = $sqs->create_queue('test-topic-queue');
    $queue_url = (string) $response->body->CreateQueueResult->QueueUrl;
    $queue_arn = 'arn:aws:sqs:us-east-1:ENCQ8gqrAcXv:test-topic-queue';
    //$queue_url . ?Action=SendMessage&MessageBody=Your%20Message%20Text?&AWSAccessKeyId=AKIAIOSFODNN7EXAMPLE&Version=2011-10-01?&Expires=2008-02-10T12:00:00Z?&Signature=lBP67vCvGlDMBQ1do?fZxg8E8SUEXAMPLE&SignatureVersion=2&SignatureMethod=HmacSHA256

    // HOW DO I INCORPORATE SQS AND S3

    // move the file
    if ($s3->putObjectFile($fileTempName,
            "mybucket",
            "myFolder/".$fileName, S3::ACL_PUBLIC_READ,
            array(),
            $_FILES['theFile']['type'])) {
        // it works
    } else {
        // error
    }
}
I'm not sure exactly what fault tolerance you are asking about, but in terms of using S3 and SQS for scaling, there is an excellent paper on the Amazon AWS website that talks about scaling your infrastructure up and down using SQS and EC2 together, which can of course include processes like uploading to S3 and using SQS to tell the application to process something. You don't mention whether you're using EC2 or if this is of interest.
Here is the article: http://aws.amazon.com/articles/1464
Otherwise, it sounds like your logic may be confused as SQS isn't an in-between from server to S3, but rather more for application messaging.
I think I figured out the OP's confusion on this topic.
The diagram shown makes it appear that SQS is handling uploads, but it really isn't: when you upload 1 or 100 photos, they're added to S3 directly; then, using SQS, the application creates a "task" which one of the EC2 "processing services" pulls, grabbing the actual picture from S3 storage as named in the SQS message.
Hopefully this gives some insight to future users who see this question feeling lost.
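A rough sketch of that flow with the current AWS SDK for PHP v3 (an assumption; the question's code uses older standalone libraries, and all names here are placeholders):
use Aws\S3\S3Client;
use Aws\Sqs\SqsClient;

$s3  = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$sqs = new SqsClient(['version' => 'latest', 'region' => 'us-east-1']);

// 1. Upload the file straight to S3; SQS is not involved in the transfer.
$s3->putObject([
    'Bucket'     => 'mybucket',
    'Key'        => 'myFolder/'.$fileName,
    'SourceFile' => $fileTempName,
]);

// 2. Queue a task message naming the object, for a worker to pick up later.
$queueUrl = $sqs->createQueue(['QueueName' => 'test-topic-queue'])->get('QueueUrl');
$sqs->sendMessage([
    'QueueUrl'    => $queueUrl,
    'MessageBody' => json_encode(['bucket' => 'mybucket', 'key' => 'myFolder/'.$fileName]),
]);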

Streaming private videos from Amazon S3

I need to display video/image files uploaded to my Amazon S3 account with ACL:PRIVATE on my WordPress blog.
I am a newbie to PHP OOP-based coding. Any script help, link references, free plugins, or even a logical algorithm would be a great help :)
Thanks in advance.
This issue can be solved by implementing the following steps:
1. Download the latest stable version of the SDK from here
2. Extract the .zip file and place it in the wamp/www folder
3. Rename the config-sample.inc.php file to config.inc.php
4. Add the access key and secret key (retrieved from your Amazon S3 account) to the above file, then save and exit
5. Create a sample file to display public/private objects from Amazon S3
The content of the file should look as follows:
require('sdk.class.php');
require('services/s3.class.php');
$s3 = new AmazonS3();
$bucket = "bucketname";
$temp_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '5 minute');
echo $temp_link;
In the above code, the URL you receive as output is a signed URL for your private object, and thus it is valid only for 5 minutes.
You may set the expiry to a later date, and in this way allow only authorized users to access your private content or media on Amazon S3.
This question is a little bit old, but I'm posting this anyway. I had a similar issue today and found out there's a simple answer.
The AWS doc explains it clearly and has an example as well:
http://docs.aws.amazon.com/aws-sdk-php-2/guide/latest/service-s3.html#amazon-s3-stream-wrapper
Basically, you need to register AWS's stream wrapper and use the s3:// protocol.
Here's my code sample.
use Aws\Common\Aws;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;

$s3 = Aws::factory(array(
    'key'    => Config::get('aws.key'),
    'secret' => Config::get('aws.secret'),
    'region' => Config::get('aws.region')
))->get('s3');

$s3->registerStreamWrapper();

// now read the file from S3 (adapted from the doc):
// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/key', 'r')) {
    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
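Once the wrapper is registered, ordinary filesystem functions work for writing as well; for example (a sketch, with placeholder bucket and key):
// Write an object through the same s3:// protocol:
file_put_contents('s3://bucket/key', 'hello world');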
