I have a web page on a web host, and the images are stored on Amazon S3. Using PHP, I want to be able to download multiple images from Amazon S3 through my web page as a single zip file.
What are my options, and which is the best?
As far as I know, it is not possible to compress files on S3. Can I use AWS Lambda?
The best solution I've come across so far:

1. The user selects on my website which images they want to download.
2. I get the file names from my database on my web host and download the images from S3 to a temporary directory on my web host.
3. A zip file is created in the temporary directory and a link is sent to the user.
4. After a certain time, a script clears the temporary directory on my web host.
But it would be great if there were a way to create and download the zip file without going through my host.
Amazon S3 provides "basic building blocks", so it doesn't support features like zipping multiple objects together.
You've come up with a good method, though you could stream the objects into a zip file rather than downloading them to disk first. EC2 instances can do this very quickly because they have fast connections to S3.
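A minimal sketch of that streaming approach, assuming the AWS SDK for PHP and the maennchen/zipstream-php package (v3) are installed via Composer; the region, bucket, and object keys are placeholders:

```php
<?php
// Sketch: stream S3 objects straight into a zip download without
// writing temporary files on the web host. Assumes the AWS SDK for
// PHP and maennchen/zipstream-php (v3); region, bucket, and keys
// below are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use ZipStream\ZipStream;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$keys = ['images/photo1.jpg', 'images/photo2.jpg']; // looked up in your database

// Sends the HTTP headers and streams the archive to the client as it is built
$zip = new ZipStream(outputName: 'images.zip');

foreach ($keys as $key) {
    $result = $s3->getObject(['Bucket' => 'my-bucket', 'Key' => $key]);
    // Body is a PSR-7 stream, so each object is piped into the zip
    // without being fully buffered in memory or written to disk
    $zip->addFileFromPsr7Stream(basename($key), $result['Body']);
}

$zip->finish();
```

Run on an EC2 instance in the same region as the bucket, this keeps the data path short (S3 to instance to client) and leaves no temporary directory to clean up.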
Lambda isn't a good fit here: in the common S3 integration, a function is triggered when an object is put into a bucket, and you are doing the opposite (reading objects out).
Related
I would like to store my moodledata folder in an S3 bucket and share it with multiple EC2 instances for autoscaling.
So, without using s3fs, I want to manage this moodledata in the S3 bucket using the AWS SDK for PHP.
I have searched many forums but have not found a solution using the AWS SDK for PHP. Regards,
There is a plugin, https://moodle.org/plugins/tool_objectfs, which almost does this. It does not replace the file system but shifts files to S3 based on criteria, for example, when the file size is greater than 10 KB. You will still need a shared file system, but you can keep costs down by shifting old and large files off to S3. In AWS, the best way to have a shared file system is the EFS service. Completely replacing the file system with S3 would be challenging, as S3 is not a file system but an object store.
We developed our application in Laravel, and now we are planning to move it to Amazon, where we have to separate the application logic from file storage. Basically, we want to move all of our application storage to cloud storage (Amazon S3) and the application logic to an Amazon EC2 server.
In our system we manipulate many stored files locally (resize images, merge images, make thumbnails from videos, etc.). We will not store any files on the application server once we migrate. So our concern is: how can we manipulate files in cloud storage?
Previously, all files were on the application server, so manipulation was easy. After migrating all storage to the cloud, how can we manipulate files that live in cloud storage when the manipulation logic resides on the application server?
Any response would be helpful.
Thanks in advance...
To manipulate an S3 file, I think you first need to download it locally. Once the file is local, you can apply any operation to it and delete the local copy afterwards.
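As a rough sketch of that download-manipulate-reupload cycle with the AWS SDK for PHP (the bucket, keys, and the GD resize step are placeholder assumptions):

```php
<?php
// Sketch of the download-manipulate-reupload cycle described above.
// Bucket names, keys, and the GD resize are placeholder assumptions.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$local = tempnam(sys_get_temp_dir(), 'img'); // temporary local copy

// 1. Download the object to a local file
$s3->getObject([
    'Bucket' => 'my-bucket',
    'Key'    => 'uploads/photo.jpg',
    'SaveAs' => $local,
]);

// 2. Manipulate it locally (a 200px-wide thumbnail via GD, as an example)
$src   = imagecreatefromjpeg($local);
$thumb = imagescale($src, 200);
imagejpeg($thumb, $local);

// 3. Upload the result back to S3
$s3->putObject([
    'Bucket'     => 'my-bucket',
    'Key'        => 'thumbnails/photo.jpg',
    'SourceFile' => $local,
]);

// 4. Delete the local copy
unlink($local);
```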
Here is the documentation on uploading directly from, and downloading to, a local file with Amazon S3:
https://aws.amazon.com/blogs/developer/transferring-files-to-and-from-amazon-s3/
https://docs.aws.amazon.com/aws-sdk-php/v3/guide/
Thanks
I have a dynamic website on which users can upload different files to be reviewed by different experts.
Users can upload/download their files to/from our server.
How can I use Amazon S3 and PHP to expand my storage? I already have an AWS account and have done some research, but I still can't figure out how to make sure that every user can access only their own files, and that their uploads go into a specific bucket with their username.
You shouldn't create a separate bucket for each user. You should use one bucket per application/environment. To work with Amazon S3 you can use the AWS SDK for PHP. Your file "structure" will look something like this: /{bucket}/{userid}/images/*, /{bucket}/{userid}/videos/*, etc.
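A minimal sketch of that layout with the AWS SDK for PHP; the bucket name and user ID are placeholders, and the user ID must come from your own authentication layer, never from client input:

```php
<?php
// Sketch: one bucket, with every object keyed by user ID so each user
// only ever touches their own prefix. Names are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$userId = 42; // from your session/auth layer, never from the request
$key    = "{$userId}/images/" . basename($_FILES['upload']['name']);

$s3->putObject([
    'Bucket'     => 'my-app-bucket',
    'Key'        => $key,
    'SourceFile' => $_FILES['upload']['tmp_name'],
]);

// When listing a user's files, constrain the prefix so they can
// never see anyone else's objects
$objects = $s3->listObjectsV2([
    'Bucket' => 'my-app-bucket',
    'Prefix' => "{$userId}/",
]);
```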
I'm trying to figure out how I can both force downloads of MP3 files from my Amazon S3 bucket and allow them to be played in the browser.
How can I control a link to either force a download or play the MP3 in the browser?
Doing some research, I've seen examples that do one or the other, but I need to be able to do both for each file, depending on which link is clicked.
I'm using PHP to script my site.
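One way to get both behaviors from a single object, sketched under the assumption that the files are served via presigned URLs: S3's GetObject accepts a ResponseContentDisposition parameter, so each link can override the Content-Disposition header per request (attachment forces a download; inline lets the browser play it). Bucket and key below are placeholders.

```php
<?php
// Sketch: two links to the same S3 object, one that plays inline and
// one that forces a download, via per-request Content-Disposition
// overrides on presigned URLs. Bucket and key are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

function mp3Url(S3Client $s3, string $key, bool $forceDownload): string
{
    $params = ['Bucket' => 'my-audio-bucket', 'Key' => $key];
    if ($forceDownload) {
        // "attachment" forces the save dialog; omitting it (or using
        // "inline") lets the browser play the file
        $params['ResponseContentDisposition'] =
            'attachment; filename="' . basename($key) . '"';
    }
    $cmd = $s3->getCommand('GetObject', $params);
    return (string) $s3->createPresignedRequest($cmd, '+15 minutes')->getUri();
}

$playUrl     = mp3Url($s3, 'tracks/song.mp3', false); // opens in the browser
$downloadUrl = mp3Url($s3, 'tracks/song.mp3', true);  // forces a download
```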
I am currently writing an application using the Yii framework in PHP that stores a large number of files uploaded by its users. Since the number of files is ever increasing, I decided it would be beneficial to store them on Amazon S3 and have the server retrieve a file and send it to the user when requested. (The server is an EC2 instance in the same zone.)
Since the files are all confidential, the server has to verify the user's identity and credentials before allowing them to receive a file. Is there a way to send the file to the user directly from S3 in this case, or do I have to pull the data to the server first and then serve it to the user?
If so, is there any way to cache recently uploaded files on the local server so that it does not have to go to S3 to look for them? In most cases, the most recently uploaded files will be requested repeatedly by multiple clients.
Any help would be greatly appreciated!
Authenticated clients can download files directly from S3 by signing the appropriate URLs on the server prior to displaying the page/URLs to the client.
For more information, see: http://s3.amazonaws.com/doc/s3-developer-guide/RESTAuthentication.html
Note that for confidential files you may also want to consider server-side or client-side encryption. Finally, for static files (such as images) you may want to set appropriate cache headers as well.
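A minimal sketch of that signing step with the AWS SDK for PHP; the bucket and key are placeholders, and the URL should only be generated after the user's credentials have been verified:

```php
<?php
// Sketch: after authenticating the user, presign a short-lived
// GetObject URL so the client downloads directly from S3.
// Bucket and key are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-private-bucket',
    'Key'    => 'confidential/report.pdf',
]);

// A short expiry limits how long a leaked URL stays usable
$request = $s3->createPresignedRequest($cmd, '+10 minutes');

// Embed this URL in the page; no file data passes through your server
$url = (string) $request->getUri();
```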
Use Amazon CloudFront to serve these static files. Rather than sending the files to the user yourself, send them links to the files. The links need to be CloudFront links, not direct links to the S3 bucket.
This has the benefit of keeping the load on your server low, as well as caching files close to your users for better performance.
More details here: Serving Private Content through CloudFront
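For private content, a sketch of generating a CloudFront signed URL with the AWS SDK for PHP; the distribution domain, key-pair ID, and private-key path are placeholders from your own CloudFront setup:

```php
<?php
// Sketch: hand out CloudFront signed URLs instead of direct S3 links.
// Domain, key-pair ID, and private-key path are placeholders.
require 'vendor/autoload.php';

use Aws\CloudFront\CloudFrontClient;

$cf = new CloudFrontClient(['region' => 'us-east-1', 'version' => 'latest']);

$signedUrl = $cf->getSignedUrl([
    'url'         => 'https://d111111abcdef8.cloudfront.net/uploads/file.pdf',
    'expires'     => time() + 300, // valid for five minutes
    'private_key' => '/path/to/cloudfront-private-key.pem',
    'key_pair_id' => 'APKAEXAMPLEKEYID',
]);

echo $signedUrl; // give this to the user, not the S3 URL
```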